FlowVAT: Normalizing Flow Variational Inference with Affine-Invariant Tempering

📅 2025-05-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Normalizing flow-based variational inference (VI) for multimodal, high-dimensional posteriors suffers from mode-seeking behavior and mode collapse, while conventional annealing methods rely on hand-crafted temperature schedules and hyperparameter tuning. Method: FlowVAT is a conditional tempering framework that tempers the base and target distributions simultaneously, keeping the tempering affine-invariant. By conditioning a single normalizing flow on temperature via temperature-conditioned neural networks, one model generalizes across a range of temperatures, eliminating explicit annealing schedules and reducing hyperparameter sensitivity. Results: On 2D, 10D, and 20D multimodal benchmarks, FlowVAT achieves better evidence lower bound (ELBO) values and recovers more true posterior modes than standard and adaptive annealing baselines, especially in higher dimensions, enabling largely automated black-box VI without manual intervention.

📝 Abstract
Multi-modal and high-dimensional posteriors present significant challenges for variational inference, causing mode-seeking behavior and collapse despite the theoretical expressiveness of normalizing flows. Traditional annealing methods require temperature schedules and hyperparameter tuning, falling short of the goal of truly black-box variational inference. We introduce FlowVAT, a conditional tempering approach for normalizing flow variational inference that addresses these limitations. Our method tempers both the base and target distributions simultaneously, maintaining affine-invariance under tempering. By conditioning the normalizing flow on temperature, we leverage overparameterized neural networks' generalization capabilities to train a single flow representing the posterior across a range of temperatures. This preserves modes identified at higher temperatures when sampling from the variational posterior at $T = 1$, mitigating standard variational methods' mode-seeking behavior. In experiments with 2, 10, and 20 dimensional multi-modal distributions, FlowVAT outperforms traditional and adaptive annealing methods, finding more modes and achieving better ELBO values, particularly in higher dimensions where existing approaches fail. Our method requires minimal hyperparameter tuning and does not require an annealing schedule, advancing toward fully-automatic black-box variational inference for complicated posteriors.
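The abstract's core mechanism is power tempering: raising the target density to $1/T$ flattens it, so a variational approximation fit at high $T$ can still cover well-separated modes. A minimal sketch of this idea, with a 1D bimodal toy target and a simple affine "flow" as the variational family (the mixture target, parameter values, and `elbo` helper are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(x):
    # Illustrative bimodal target: equal mixture of N(-3, 1) and N(3, 1).
    return np.logaddexp(-0.5 * (x + 3) ** 2, -0.5 * (x - 3) ** 2) \
        - 0.5 * np.log(2 * np.pi) - np.log(2)

def tempered_log_p(x, T):
    # Power tempering: log p(x)^(1/T). Note the tempered density is
    # unnormalized, so this ELBO is only a training objective, not a
    # bound on the true log evidence.
    return log_p(x) / T

def elbo(mu, log_sigma, T, n=10_000):
    # Affine "flow": x = mu + sigma * z with z ~ N(0, 1); log q follows
    # from the change-of-variables formula.
    z = rng.standard_normal(n)
    x = mu + np.exp(log_sigma) * z
    log_q = -0.5 * z ** 2 - 0.5 * np.log(2 * np.pi) - log_sigma
    return float(np.mean(tempered_log_p(x, T) - log_q))

# A wide approximation that straddles both modes scores much better
# against the flattened (high-T) target than against the T = 1 target.
print(elbo(0.0, np.log(4.0), T=5.0))
print(elbo(0.0, np.log(4.0), T=1.0))
```

The gap between the two printed values illustrates why tempering helps: at high temperature the mode-covering solution is rewarded, and FlowVAT's contribution is carrying that solution down to $T = 1$ within a single conditioned flow rather than via a hand-tuned schedule.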
Problem

Research questions and friction points this paper is trying to address.

Addressing mode collapse in high-dimensional variational inference
Eliminating need for manual temperature scheduling in annealing
Enabling black-box inference for multi-modal posterior distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Conditional tempering for normalizing flow inference
Affine-invariant tempering of base and target distributions
Single flow trained across multiple temperatures
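The third innovation, one flow shared across temperatures, can be sketched with a toy stand-in: a single parameter set whose shift and scale are simple functions of $\log T$. The parameterization below is hypothetical and purely illustrative; FlowVAT conditions an expressive normalizing flow on temperature with neural networks rather than this linear rule:

```python
import numpy as np

class TemperatureConditionedAffineFlow:
    """Toy stand-in for a conditional flow: one parameter set, all temperatures.

    The shift/scale depend linearly on log T (a hypothetical parameterization
    chosen so samples widen at higher temperature, mimicking a tempered base).
    """

    def __init__(self, w_mu=0.0, b_mu=0.0, w_s=0.5, b_s=0.0):
        self.w_mu, self.b_mu, self.w_s, self.b_s = w_mu, b_mu, w_s, b_s

    def params(self, T):
        t = np.log(T)
        mu = self.w_mu * t + self.b_mu
        log_sigma = self.w_s * t + self.b_s  # wider at higher T
        return mu, log_sigma

    def sample(self, T, n, rng):
        # x = mu(T) + sigma(T) * z, z ~ N(0, 1); log q via change of variables.
        mu, log_sigma = self.params(T)
        z = rng.standard_normal(n)
        x = mu + np.exp(log_sigma) * z
        log_q = -0.5 * z ** 2 - 0.5 * np.log(2 * np.pi) - log_sigma
        return x, log_q

rng = np.random.default_rng(1)
flow = TemperatureConditionedAffineFlow()
for T in (8.0, 2.0, 1.0):
    x, _ = flow.sample(T, 100_000, rng)
    print(f"T={T}: sample std ~ {x.std():.2f}")
```

The same object serves every temperature, so sampling at $T = 1$ after training across a temperature range inherits structure (here, just the width; in FlowVAT, the mode locations) learned at higher $T$, with no annealing schedule to tune.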