Transport meets Variational Inference: Controlled Monte Carlo Diffusions

📅 2023-07-03
🏛️ International Conference on Learning Representations
📈 Citations: 3
Influential: 0
🤖 AI Summary
This paper addresses the inefficiency of path-space sampling and generative modelling in Bayesian computation by proposing a framework built on path-space divergences that integrates optimal transport and variational inference. Methodologically, it (1) brings the Jarzynski and Crooks identities from statistical physics into diffusion-based sampling, yielding the Controlled Monte Carlo Diffusion (CMCD) sampler, a score-based annealing method that adapts both the forward and backward dynamics of a diffusion model; (2) clarifies the relationship between the EM algorithm and Schrödinger-bridge-based Iterative Proportional Fitting (IPF), deriving a regularised objective that sidesteps IPF's slow, numerically unstable iterative updates; and (3) jointly optimises the forward/backward diffusion dynamics and score estimation under a unified variational principle. Experiments across Bayesian inference tasks, including posterior sampling, marginal likelihood estimation, and latent-variable modelling, show CMCD outperforming competing diffusion-based samplers and MCMC methods in sample quality and computational efficiency.
📝 Abstract
Connecting optimal transport and variational inference, we present a principled and systematic framework for sampling and generative modelling centred around divergences on path space. Our work culminates in the development of the *Controlled Monte Carlo Diffusion* sampler (CMCD) for Bayesian computation, a score-based annealing technique that crucially adapts both forward and backward dynamics in a diffusion model. On the way, we clarify the relationship between the EM-algorithm and iterative proportional fitting (IPF) for Schrödinger bridges, deriving as well a regularised objective that bypasses the iterative bottleneck of standard IPF-updates. Finally, we show that CMCD has a strong foundation in the Jarzynski and Crooks identities from statistical physics, and that it convincingly outperforms competing approaches across a wide array of experiments.
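To make the Jarzynski connection concrete, here is a minimal, hedged sketch of annealed Langevin sampling with Jarzynski-style work accumulation, the classical identity the abstract says CMCD builds on. This is *not* the paper's CMCD sampler (it learns neither forward nor backward controls); it only illustrates the underlying principle of accumulating log-density increments along an annealing schedule. All names, the Gaussian base/target pair, and the step sizes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prob_0(x):          # base density: N(0, 1), unnormalized
    return -0.5 * x**2

def log_prob_1(x):          # target density: N(3, 1), unnormalized
    return -0.5 * (x - 3.0)**2

def log_prob_t(x, t):       # geometric annealing path between base and target
    return (1 - t) * log_prob_0(x) + t * log_prob_1(x)

def grad_log_prob_t(x, t):  # score of the annealed density
    return (1 - t) * (-x) + t * (3.0 - x)

n_particles, n_steps, eps = 5000, 200, 0.05
x = rng.standard_normal(n_particles)   # exact samples from the base
log_w = np.zeros(n_particles)          # Jarzynski "work" accumulator
ts = np.linspace(0.0, 1.0, n_steps + 1)

for t_prev, t_next in zip(ts[:-1], ts[1:]):
    # Jarzynski increment: change in log-density as the schedule advances
    log_w += log_prob_t(x, t_next) - log_prob_t(x, t_prev)
    # Unadjusted Langevin step targeting the current annealed density
    x += eps * grad_log_prob_t(x, t_next) \
         + np.sqrt(2 * eps) * rng.standard_normal(n_particles)

# Log-sum-exp estimate of log(Z_1 / Z_0); both normalizers are sqrt(2*pi)
# here, so the true value is 0 (up to Monte Carlo and discretization error).
log_Z_ratio = np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
print(f"estimated log Z_1/Z_0: {log_Z_ratio:.3f}")
```

The design point the paper improves on is visible here: the forward (annealing) dynamics are fixed by hand, so the weights `log_w` can have high variance; CMCD instead learns controls for both directions to tighten that gap.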
Problem

Research questions and friction points this paper is trying to address.

How to connect optimal transport and variational inference in a single path-space framework for sampling
How to adapt both forward and backward diffusion dynamics for efficient Bayesian computation
How the EM-algorithm relates to iterative proportional fitting for Schrödinger bridges
Innovation

Methods, ideas, or system contributions that make the work stand out.

Connects optimal transport and variational inference
Develops Controlled Monte Carlo Diffusion sampler
Regularized objective bypasses iterative IPF bottleneck