Asymptotically exact variational flows via involutive MCMC kernels

📅 2025-06-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing expressive variational flows, such as normalizing flows, lack practical convergence guarantees: their theoretical analyses typically hold only at an intractable global optimum. Method: We propose a class of parameter-free, asymptotically exact variational flows by establishing an equivalence between involutive MCMC kernels and invertible, measure-preserving iterated random function systems (IRFS), which serve as the flow maps of the variational family and yield rigorous total variation (TV) convergence guarantees under weak assumptions. Contribution/Results: (1) three new variational families that require no hyperparameter tuning, rely only on weak assumptions, and possess provable TV convergence; (2) a resolution of the trade-off between practicality and theoretical rigor in prior approaches with similar guarantees (e.g., MixFlows). Experiments demonstrate that the method matches or surpasses NUTS and black-box normalizing flows in posterior approximation, Monte Carlo estimation, and normalizing constant estimation, combining theoretical soundness with empirical effectiveness.
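To make the construction concrete, here is a minimal, illustrative sketch (not the paper's exact algorithm) of one involutive MCMC step, a leapfrog proposal composed with a momentum flip, written as a deterministic map on an augmented state. A deterministic irrational rotation on [0, 1) stands in for fresh uniform draws; the standard-Gaussian target, step size, and rotation constant are assumptions made for illustration.

```python
import numpy as np

# Minimal illustrative sketch (not the paper's exact construction):
# an involutive MCMC step as a deterministic map on an augmented
# state (x, v, u). Target, step size, and the sqrt(2) rotation
# constant are illustrative assumptions.

def log_target(x):
    return -0.5 * x ** 2          # standard Gaussian, up to a constant

def grad_log_target(x):
    return -x

def leapfrog(x, v, step_size=0.3):
    # One leapfrog step of Hamiltonian dynamics.
    v = v + 0.5 * step_size * grad_log_target(x)
    x = x + step_size * v
    v = v + 0.5 * step_size * grad_log_target(x)
    return x, v

def flow_map(x, v, u):
    # Irrational rotation on [0, 1): deterministic, invertible, and
    # Lebesgue-measure-preserving -- it stands in for a uniform draw.
    u = (u + np.sqrt(2.0)) % 1.0
    x_new, v_new = leapfrog(x, v)
    v_new = -v_new                # momentum flip makes the proposal involutive
    # Metropolis-Hastings correction driven by u instead of a PRNG.
    log_ratio = (log_target(x_new) - 0.5 * v_new ** 2) \
              - (log_target(x) - 0.5 * v ** 2)
    if np.log(u) < log_ratio:
        x, v = x_new, v_new
    return x, v, u

# Iterating the map from a draw of a tractable reference gives one
# sample of the flow. A full construction would also refresh v with a
# measure-preserving map and make the accept/reject step invertible.
x, v, u = 0.0, 1.0, 0.123
for _ in range(100):
    x, v, u = flow_map(x, v, u)
print(x)
```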

📝 Abstract
Most expressive variational families -- such as normalizing flows -- lack practical convergence guarantees, as their theoretical assurances typically hold only at the intractable global optimum. In this work, we present a general recipe for constructing tuning-free, asymptotically exact variational flows from involutive MCMC kernels. The core methodological component is a novel representation of general involutive MCMC kernels as invertible, measure-preserving iterated random function systems, which act as the flow maps of our variational flows. This leads to three new variational families with provable total variation convergence. Our framework resolves key practical limitations of existing variational families with similar guarantees (e.g., MixFlows), while requiring substantially weaker theoretical assumptions. Finally, we demonstrate the competitive performance of our flows across tasks including posterior approximation, Monte Carlo estimates, and normalization constant estimation, outperforming or matching No-U-Turn sampler (NUTS) and black-box normalizing flows.
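Where the abstract mentions normalization constant estimation, the computation reduces to a standard importance-sampling identity once a flow supplies draws together with a tractable log-density. The sketch below is a hypothetical helper, not the paper's API: `samples`, `log_p_unnorm`, and `log_q` are illustrative stand-ins.

```python
import numpy as np

# Hypothetical helper: estimate log Z = log E_q[p_unnorm(X) / q(X)]
# from flow samples, stabilized with the log-sum-exp trick.
def estimate_log_Z(samples, log_p_unnorm, log_q):
    log_w = np.array([log_p_unnorm(x) - log_q(x) for x in samples])
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))
```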
Problem

Research questions and friction points this paper is trying to address.

Lack of practical convergence guarantees in expressive variational families
Constructing asymptotically exact variational flows from MCMC kernels
Resolving practical limitations of prior guaranteed variational families (e.g., MixFlows) under substantially weaker theoretical assumptions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Involutive MCMC kernels reinterpreted as the flow maps of variational flows
Representation as invertible, measure-preserving iterated random function systems (IRFS)
Three new variational families with provable total variation convergence (see the density sketch below)
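To see why measure preservation matters here, the following sketch (in the spirit of MixFlow-style averaged flows, under stated assumptions) evaluates the density of an averaged flow without computing Jacobians: if F preserves the augmented target, the Jacobian of F applied backwards is a ratio of target densities. `inv_flow`, `log_q0`, and `log_pibar` are hypothetical stand-ins, not the paper's API.

```python
import numpy as np

# Sketch, not the paper's implementation: density of the averaged flow
# q_N = (1/N) * sum_{n=0}^{N-1} F^n_# q0 at a point y. Measure
# preservation of F w.r.t. pibar gives the Jacobian for free:
# |det J_{F^{-n}}(y)| = pibar(y) / pibar(F^{-n}(y)).
def log_flow_density(y, inv_flow, log_q0, log_pibar, n_steps):
    z = y
    log_terms = [log_q0(y)]      # n = 0 term of the mixture
    for _ in range(1, n_steps):
        z = inv_flow(z)          # z = F^{-n}(y); inv_flow assumed invertible
        log_terms.append(log_q0(z) + log_pibar(y) - log_pibar(z))
    m = max(log_terms)           # log-mean-exp over the N mixture terms
    return m + np.log(np.mean([np.exp(t - m) for t in log_terms]))
```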
Zuheng Xu
Department of Statistics, University of British Columbia
Trevor Campbell
Associate Professor, Statistics, UBC
Machine Learning · Statistics · Optimization · Mathematics