🤖 AI Summary
This work addresses the inefficiency and slow convergence of traditional Markov chain Monte Carlo (MCMC) methods, such as reversible jump or birth-death algorithms, in exploring high-dimensional binary model spaces. To overcome these limitations, the authors propose the Multiple Jump MCMC algorithm, which performs multi-step structural jumps while remaining rejection-free, thereby dramatically accelerating exploration of the model space while rigorously preserving posterior accuracy. The method has a simple and general structure, making it readily applicable to a range of Bayesian inference problems, including graphical models, variable selection, and mixture distributions. Empirical evaluations on undirected Gaussian graphical models demonstrate that the proposed approach achieves a 100–200× speedup over state-of-the-art methods, handling models with up to 500,000 parameters in under one minute.
📝 Abstract
This article considers Bayesian model inference on binary model spaces. Binary model spaces are used by a large class of models that includes graphical models, variable selection, mixture distributions, and decision trees. Traditional strategies in this field, such as reversible jump or birth-death MCMC algorithms, remain popular despite suffering from slow exploration of the model space. In this article, we propose an alternative: the Multiple Jump MCMC algorithm. The algorithm is simple, rejection-free, and remarkably fast. When applied to undirected Gaussian graphical models, it is $100$ to $200$ times faster than the state of the art, solving models with $500{,}000$ parameters in less than a minute. We provide theorems showing how accurately our algorithm targets the posterior, and we apply our framework to Gaussian graphical models, Ising models, and variable selection, but note that it applies to most Bayesian posterior inference on binary model spaces.
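To make "binary model space" concrete, here is a minimal sketch of the kind of traditional baseline the abstract contrasts with: a single-flip Metropolis sampler over inclusion vectors $\gamma \in \{0,1\}^p$, as used in Bayesian variable selection. This is *not* the paper's Multiple Jump algorithm; the sampler, the toy `log_post`, and all names below are illustrative assumptions. It changes one coordinate per iteration and rejects proposals, which is exactly the slow exploration the proposed method avoids.

```python
import numpy as np

def single_flip_mcmc(log_post, p, n_iter, seed=0):
    """Generic one-flip-at-a-time Metropolis sampler on {0,1}^p.

    log_post: user-supplied unnormalized log-posterior over binary vectors
    (hypothetical; the real model would come from the application).
    """
    rng = np.random.default_rng(seed)
    gamma = np.zeros(p, dtype=int)  # start from the empty model
    lp = log_post(gamma)
    samples = []
    for _ in range(n_iter):
        j = rng.integers(p)          # propose flipping a single coordinate
        prop = gamma.copy()
        prop[j] ^= 1
        lp_prop = log_post(prop)
        # Metropolis accept/reject: this rejection step is what makes
        # single-jump exploration slow in high dimensions.
        if np.log(rng.random()) < lp_prop - lp:
            gamma, lp = prop, lp_prop
        samples.append(gamma.copy())
    return np.array(samples)

# Toy unnormalized log-posterior: rewards agreement with a fixed "true" model.
target = np.array([1, 0, 1, 0, 1])
log_post = lambda g: 3.0 * np.sum(g == target)

samples = single_flip_mcmc(log_post, p=5, n_iter=5000)
# Approximate posterior inclusion probabilities from the tail of the chain.
print(samples[-1000:].mean(axis=0).round(2))
```

Each iteration moves at most one Hamming step, so reaching a distant high-probability model takes many accepted flips; this is the behavior the rejection-free, multi-step jumps of the proposed algorithm are designed to overcome.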