Parameter Expanded Stochastic Gradient Markov Chain Monte Carlo

๐Ÿ“… 2025-03-02
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Bayesian neural networks (BNNs) rely on posterior sampling for uncertainty quantification and out-of-distribution (OOD) robustness, yet stochastic gradient Markov chain Monte Carlo (SGMCMC) methods often suffer from insufficient sample diversity, leading to biased posterior estimates. To address this, we propose a parameter-expansion reparameterization strategy: each weight matrix is factorized into a product of matrices, enabling enhanced trajectory exploration and faster mixing of SGMCMC chains without temperature scaling, multi-chain parallelism, or added computational cost. Our approach unifies matrix factorization, Langevin dynamics, and Bayesian inference, with theoretical guarantees of posterior consistency. Experiments on image classification demonstrate substantial improvements over standard SGMCMC and Hamiltonian Monte Carlo: higher OOD detection accuracy, greater sample diversity, broader coverage of the loss landscape, and unchanged inference overhead.

๐Ÿ“ Abstract
Bayesian Neural Networks (BNNs) provide a promising framework for modeling predictive uncertainty and enhancing out-of-distribution (OOD) robustness by estimating the posterior distribution of network parameters. Stochastic Gradient Markov Chain Monte Carlo (SGMCMC) is one of the most powerful methods for scalable posterior sampling in BNNs, achieving efficiency by combining stochastic gradient descent with second-order Langevin dynamics. However, SGMCMC often suffers from limited sample diversity in practice, which affects uncertainty estimation and model performance. We propose a simple yet effective approach to enhance sample diversity in SGMCMC without the need for tempering or running multiple chains. Our approach reparameterizes the neural network by decomposing each of its weight matrices into a product of matrices, resulting in a sampling trajectory that better explores the target parameter space. This approach produces a more diverse set of samples, allowing faster mixing within the same computational budget. Notably, our sampler achieves these improvements without increasing the inference cost compared to standard SGMCMC. Extensive experiments on image classification tasks, including OOD robustness, diversity, loss surface analyses, and a comparative study with Hamiltonian Monte Carlo, demonstrate the superiority of the proposed approach.
Problem

Research questions and friction points this paper is trying to address.

Enhance sample diversity in SGMCMC for BNNs
Improve uncertainty estimation and model performance
Achieve faster mixing without increasing inference cost
Innovation

Methods, ideas, or system contributions that make the work stand out.

Reparameterizes neural network weight matrices
Enhances sample diversity in SGMCMC
Maintains inference cost efficiency
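The reparameterization idea above can be sketched with stochastic gradient Langevin dynamics on a toy linear model: instead of sampling a weight vector `w` directly, we sample an expanded pair `(V, u)` whose product `V @ u` plays the role of `w` in the likelihood. The data, dimensions, step size, and first-order update rule here are illustrative assumptions for exposition, not the paper's exact algorithm (which applies second-order Langevin dynamics to full neural networks).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: linear regression y = X w* + noise (illustrative setup).
n, d = 200, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def sgld_expanded(steps=2000, lr=1e-4, batch=32, k=10):
    """SGLD on an expanded parameterization w = V @ u.

    Rather than sampling w (shape (d,)) directly, we sample V (d, k)
    and u (k,); only the effective weight w = V @ u enters the
    likelihood. Hypothetical sketch of the parameter-expansion idea.
    """
    V = rng.normal(size=(d, k)) / np.sqrt(k)
    u = rng.normal(size=k)
    samples = []
    for t in range(steps):
        idx = rng.choice(n, size=batch, replace=False)
        Xb, yb = X[idx], y[idx]
        w = V @ u
        resid = Xb @ w - yb                      # minibatch residual
        grad_w = (n / batch) * Xb.T @ resid + w  # scaled likelihood grad + prior
        # Chain rule through the factorization w = V u
        grad_V = np.outer(grad_w, u)
        grad_u = V.T @ grad_w
        # Langevin update: gradient step plus injected Gaussian noise
        V -= lr * grad_V - np.sqrt(2 * lr) * rng.normal(size=V.shape)
        u -= lr * grad_u - np.sqrt(2 * lr) * rng.normal(size=u.shape)
        if t % 100 == 0:
            samples.append(V @ u)                # store the effective weight
    return np.array(samples)

samples = sgld_expanded()
posterior_mean = samples[len(samples) // 2:].mean(axis=0)  # discard burn-in
```

Note that the gradient noise enters through both factors, so the effective update on `w = V @ u` is preconditioned by the factor magnitudes, which is one intuition for why the expanded chain can explore more aggressively at no extra inference cost: after sampling, each stored sample collapses back to a single weight matrix.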
๐Ÿ”Ž Similar Papers
No similar papers found.