Exact and Approximate MCMC for Doubly-intractable Probabilistic Graphical Models Leveraging the Underlying Independence Model

📅 2025-10-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Bayesian inference for doubly-intractable probabilistic graphical models, where the likelihood itself contains an intractable, parameter-dependent normalizing constant, remains challenging: existing MCMC methods rely on perfect or sequential samplers that are often unavailable or mix poorly in high dimensions. Method: the authors propose a generic MCMC framework that requires no such underlying sampler; instead, it constructs a finite-sample unbiased Monte Carlo estimator of the Metropolis–Hastings acceptance ratio using the tractable independence model underlying the graphical model. The approach covers both exact and approximate MCMC and naturally accommodates gradient-driven proposals (e.g., Langevin and Hamiltonian dynamics). Results: experiments on the Ising model show improved mixing and scalability in high dimensions without reliance on intricate auxiliary samplers, yielding a practical, gradient-compatible route to Bayesian inference for doubly-intractable models.

📝 Abstract
Bayesian inference for doubly-intractable probabilistic graphical models typically involves variations of the exchange algorithm or approximate Markov chain Monte Carlo (MCMC) samplers. However, existing methods for both classes of algorithms require either perfect samplers or sequential samplers for complex models, which are often either not available, or suffer from poor mixing, especially in high dimensions. We develop a method that does not require perfect or sequential sampling, and can be applied to both classes of methods: exact and approximate MCMC. The key to our approach is to utilize the tractable independence model underlying an intractable probabilistic graphical model for the purpose of constructing a finite sample unbiased Monte Carlo (and not MCMC) estimate of the Metropolis–Hastings ratio. This innovation turns out to be crucial for scalability in high dimensions. The method is demonstrated on the Ising model. Gradient-based alternatives to construct a proposal, such as Langevin and Hamiltonian Monte Carlo approaches, also arise as a natural corollary to our general procedure, and are demonstrated as well.
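To make the core ingredient concrete, here is a minimal sketch (not the paper's exact construction) of how the tractable independence model can act as an importance-sampling proposal for the intractable Ising normalizing constant. The toy model size, the uniform interaction matrix, and the function names (`ising_unnorm_logpdf`, `is_log_Z_estimate`) are illustrative assumptions; the paper's actual contribution is a finite-sample unbiased estimator of the full Metropolis–Hastings ratio built from quantities of this kind.

```python
import numpy as np

rng = np.random.default_rng(0)

def ising_unnorm_logpdf(x, theta, J):
    # Unnormalized log-density of a small fully connected Ising model:
    # log f(x; theta) = theta * sum_i x_i + (1/2) x^T J x, with x in {-1, +1}^d
    # and J symmetric with zero diagonal, so each pair is counted once.
    return theta * x.sum() + 0.5 * x @ J @ x

def independence_logpdf(x, p):
    # The tractable independence model q: independent spins, P(X_i = +1) = p_i.
    return np.log(np.where(x > 0, p, 1.0 - p)).sum()

def is_log_Z_estimate(theta, J, p, m=5000):
    # Importance-sampling estimate of log Z(theta) using q as the proposal:
    #   Z(theta) = E_q[ f(X; theta) / q(X) ],  X ~ q,
    # so the sample mean of the weights f(x_j)/q(x_j) is unbiased for Z(theta).
    d = len(p)
    xs = np.where(rng.random((m, d)) < p, 1.0, -1.0)
    logw = np.array([ising_unnorm_logpdf(x, theta, J) - independence_logpdf(x, p)
                     for x in xs])
    mx = logw.max()  # log-mean-exp for numerical stability
    return mx + np.log(np.mean(np.exp(logw - mx)))

d = 8
J = 0.1 * (np.ones((d, d)) - np.eye(d))  # weak uniform pairwise interactions
p = np.full(d, 0.5)  # independence model obtained by zeroing the interactions
print("estimated log Z(0.2):", is_log_Z_estimate(0.2, J, p))
```

Because each importance weight is unbiased for Z(theta), estimates of this type can be assembled into the acceptance step of an MCMC sampler without ever drawing from the Ising model itself, which is the mechanism the abstract describes.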
Problem

Research questions and friction points this paper is trying to address.

Developing MCMC methods without perfect sampling requirements
Addressing scalability issues in high-dimensional graphical models
Enabling unbiased estimation for doubly-intractable Bayesian inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leverages tractable independence model for unbiased estimates
Eliminates need for perfect or sequential sampling
Enables gradient-based proposals such as Langevin and Hamiltonian Monte Carlo (see the sketch after this list)
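The gradient-based proposals need the score of the posterior, whose intractable part is the derivative of log Z(theta), an expectation of the sufficient statistic under the model. Below is a hedged sketch of approximating that expectation with self-normalized importance sampling under the same independence model and plugging it into a Langevin (MALA-style) proposal. The function names, step size, and interfaces are illustrative assumptions, and the self-normalized estimate shown here is consistent rather than unbiased, unlike the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_log_Z_estimate(theta, suff_stat, sample_q, log_q, log_f, m=2000):
    # Self-normalized importance-sampling estimate of
    #   d/dtheta log Z(theta) = E_{p(. | theta)}[ S(X) ],
    # with draws from the tractable independence model q. Consistent but not
    # unbiased; it stands in here for the paper's estimator.
    xs = sample_q(m)
    logw = np.array([log_f(x, theta) - log_q(x) for x in xs])
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return sum(wi * suff_stat(x) for wi, x in zip(w, xs))

def langevin_proposal(theta, grad_log_post, eps=0.05):
    # One MALA-style proposal: drift along the (estimated) posterior gradient
    # plus Gaussian noise. The subsequent accept/reject step would use the
    # estimated Metropolis-Hastings ratio instead of the exact intractable one.
    return theta + 0.5 * eps**2 * grad_log_post(theta) + eps * rng.normal()
```

The design point is that both the proposal drift and the acceptance ratio reuse draws from the same tractable independence model, so no perfect or sequential sampler of the graphical model is ever required.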
Authors

Yujie Chen
Department of Statistics, Purdue University

Antik Chakraborty
Department of Statistics, Purdue University

Anindya Bhadra
Professor of Statistics, Purdue University
Bayesian methods · High-dimensional and complex data · Computational statistics