A Flexible Empirical Bayes Approach to Generalized Linear Models, with Applications to Sparse Logistic Regression

📅 2026-01-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes a novel empirical Bayes approach to address the challenges of hyperparameter tuning and limited flexibility in Bayesian inference for generalized linear models. Departing from conventional variational inference, which explicitly models the posterior density, the method directly optimizes the posterior mean and prior hyperparameters without assuming a Gaussian variational family or requiring manual tuning. By integrating mean-field variational inference with L-BFGS or stochastic gradient optimization, the framework provides a unified and scalable solution compatible with exponential-family likelihoods and a broad class of priors. Empirical evaluation on sparse logistic regression demonstrates that the proposed method achieves superior predictive performance compared to existing state-of-the-art approaches.

📝 Abstract
We introduce a flexible empirical Bayes approach for fitting Bayesian generalized linear models. Specifically, we adopt a novel mean-field variational inference (VI) method in which the prior is estimated within the VI algorithm, making the method tuning-free. Unlike traditional VI methods that optimize the posterior density function, our approach directly optimizes the posterior mean and prior parameters. This formulation reduces the number of parameters to optimize and enables the use of scalable algorithms such as L-BFGS and stochastic gradient descent. Furthermore, our method automatically determines the optimal form of the posterior from the prior and likelihood, distinguishing it from existing VI methods that often assume a Gaussian variational family. Our approach thus provides a unified framework applicable to a wide range of exponential-family distributions, removing the need to develop a bespoke VI method for each combination of likelihood and prior. We apply the framework to sparse logistic regression and, in extensive numerical studies comparing it to prevalent sparse logistic regression approaches, demonstrate its superior predictive performance.
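The abstract's central idea, estimating the prior jointly with the model fit so that no hyperparameter needs hand-tuning, can be illustrated with a much simpler stand-in than the paper's VI algorithm. The sketch below alternates an L-BFGS MAP fit of logistic-regression coefficients under a Gaussian prior with a plug-in empirical-Bayes update of that prior's variance. The alternating scheme, the synthetic data, and all names here are illustrative assumptions, not the authors' method:

```python
import numpy as np
from scipy.optimize import minimize

def nll_ridge(beta, X, y, prior_var):
    """Logistic negative log-likelihood plus a Gaussian (ridge) prior term."""
    z = X @ beta
    nll = np.sum(np.logaddexp(0.0, z) - y * z)  # stable log(1 + e^z) - y*z
    return nll + 0.5 * np.sum(beta ** 2) / prior_var

# Synthetic data with a sparse true coefficient vector (illustrative only).
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)

# Alternate a MAP fit of beta (via L-BFGS) with an empirical-Bayes-style
# update of the prior variance -- no hyperparameter is tuned by hand.
beta, prior_var = np.zeros(p), 1.0
for _ in range(20):
    beta = minimize(nll_ridge, beta, args=(X, y, prior_var),
                    method="L-BFGS-B").x
    prior_var = max(np.mean(beta ** 2), 1e-6)  # plug-in prior estimate
```

The paper's framework replaces both steps with a single mean-field VI objective over the posterior mean and prior parameters, but the loop above conveys why direct optimization with L-BFGS (or SGD, for large n) makes the approach scalable.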
Problem

Research questions and friction points this paper is trying to address.

empirical Bayes
generalized linear models
sparse logistic regression
variational inference
posterior approximation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Empirical Bayes
Variational Inference
Generalized Linear Models
Sparse Logistic Regression
Tuning-free