Group Spike-and-Slab Variational Bayes

📅 2023-09-19
🏛️ Bayesian Analysis
📈 Citations: 2
Influential: 0
🤖 AI Summary
This paper proposes GSVB, a scalable variational Bayesian framework for group-sparse regression in generalized linear models with Gaussian, Binomial, and Poisson responses. GSVB places a group-wise spike-and-slab prior on the coefficients and fits the posterior with coordinate-ascent variational inference (CAVI). On the theoretical side, it derives contraction rates for the variational posterior in grouped linear regression, giving the approximation asymptotic guarantees. Empirically, GSVB performs comparably to or better than existing maximum-a-posteriori (MAP) methods while additionally providing uncertainty quantification, and it matches the predictive and inferential performance of Markov chain Monte Carlo (MCMC) at a fraction of the computational cost. Experiments on simulations and three real-world datasets demonstrate state-of-the-art results, making GSVB a robust, efficient, and interpretable tool for group-sparse learning at scale.
📝 Abstract
We introduce Group Spike-and-slab Variational Bayes (GSVB), a scalable method for group sparse regression. A fast co-ordinate ascent variational inference (CAVI) algorithm is developed for several common model families including Gaussian, Binomial and Poisson. Theoretical guarantees for our proposed approach are provided by deriving contraction rates for the variational posterior in grouped linear regression. Through extensive numerical studies, we demonstrate that GSVB provides state-of-the-art performance, offering a computationally inexpensive substitute to MCMC, whilst performing comparably or better than existing MAP methods. Additionally, we analyze three real world datasets wherein we highlight the practical utility of our method, demonstrating that GSVB provides parsimonious models with excellent predictive performance, variable selection and uncertainty quantification.
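The group-wise spike-and-slab prior at the heart of GSVB zeroes out or activates coefficients a whole group at a time, rather than individually. A minimal sketch of drawing from such a prior, using a Gaussian slab and made-up hyperparameters purely for illustration (the paper's slab density and settings differ):

```python
import numpy as np

def sample_group_spike_slab(groups, p, w=0.2, slab_sd=1.0, rng=None):
    """Draw one coefficient vector from a group spike-and-slab prior.

    Each group enters the model with probability w; an included group's
    coefficients are drawn jointly from the slab (a Gaussian here, as a
    stand-in for the paper's slab density); an excluded group is exactly 0.
    """
    rng = rng or np.random.default_rng()
    beta = np.zeros(p)
    for g in groups:
        if rng.random() < w:          # slab: the whole block becomes active
            beta[g] = rng.normal(0.0, slab_sd, size=len(g))
    return beta

groups = [[0, 1, 2], [3, 4], [5, 6, 7, 8]]
beta = sample_group_spike_slab(groups, p=9, w=0.5, rng=np.random.default_rng(0))
# coefficients are zeroed or activated group-by-group, never individually
```

The block structure is what distinguishes this from an ordinary (element-wise) spike-and-slab prior: a group's coefficients are all zero or all drawn from the slab together.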
Problem

Research questions and friction points this paper is trying to address.

Develops a scalable group-sparse regression method based on variational inference
Provides theoretical guarantees via contraction rates for the variational posterior
Offers a computationally efficient alternative to MCMC with competitive performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Group spike-and-slab variational Bayes (GSVB) for sparse regression
Coordinate-ascent variational inference (CAVI) for Gaussian, Binomial, and Poisson families
Computationally efficient alternative to MCMC, backed by theoretical guarantees
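The CAVI idea can be sketched for the Gaussian linear model: each group g gets a variational factor that mixes a point mass at zero with a Gaussian slab, gamma_g * N(mu_g, Sigma_g) + (1 - gamma_g) * delta_0, and the factors are updated one group at a time. The toy implementation below assumes a Gaussian slab, known noise variance, and invented hyperparameters; it illustrates the update pattern, not the paper's actual algorithm:

```python
import numpy as np

def gsvb_cavi_gaussian(y, X, groups, w=0.2, tau2=1.0, sigma2=1.0, iters=50):
    """Toy coordinate-ascent VI for group spike-and-slab linear regression.

    Variational family per group g:
        q(beta_g) = gamma_g * N(mu_g, Sigma_g) + (1 - gamma_g) * delta_0
    Illustrative sketch only; the paper's CAVI updates differ in detail.
    """
    mus = [np.zeros(len(g)) for g in groups]
    gammas = np.full(len(groups), 0.5)
    # slab covariances depend only on the data, so precompute them
    Sigmas = [np.linalg.inv(X[:, g].T @ X[:, g] / sigma2
                            + np.eye(len(g)) / tau2) for g in groups]
    for _ in range(iters):
        for i, g in enumerate(groups):
            Xg = X[:, g]
            # residual excluding group g's own expected contribution
            fit = sum(gammas[j] * X[:, h] @ mus[j]
                      for j, h in enumerate(groups) if j != i)
            r = y - fit
            mus[i] = Sigmas[i] @ (Xg.T @ r) / sigma2
            # log-odds that group g is included
            u = (np.log(w / (1 - w))
                 + 0.5 * (np.linalg.slogdet(Sigmas[i])[1]
                          - len(g) * np.log(tau2))
                 + 0.5 * mus[i] @ np.linalg.solve(Sigmas[i], mus[i]))
            gammas[i] = 1.0 / (1.0 + np.exp(-u))
    return gammas, mus

# tiny synthetic check: group 0 carries signal, group 1 does not
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = X @ np.array([2.0, -1.5, 0.0, 0.0]) + 0.5 * rng.normal(size=200)
gammas, mus = gsvb_cavi_gaussian(y, X, [[0, 1], [2, 3]], sigma2=0.25)
# gammas: expect the signal group's inclusion probability near 1,
# the null group's well below the prior weight w
```

Because each update touches one group and reuses cached group-level quantities, each sweep is cheap, which is the source of the speedup over MCMC emphasized in the paper.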