Annealing Flow Generative Models Towards Sampling High-Dimensional and Multi-Modal Distributions

📅 2024-09-30
📈 Citations: 4
Influential: 0
📄 PDF
🤖 AI Summary
Sampling from high-dimensional multimodal distributions remains a fundamental challenge in Bayesian inference and physics-informed machine learning. To address this, we propose Annealing Flow (AF), the first continuous normalizing flow (CNF) framework integrating annealing dynamics with an optimal transport (OT)-based training objective and Wasserstein distance regularization. This coupling explicitly guides inter-modal exploration and improves mode coverage. AF employs a temperature-scheduled annealing process for progressive distribution approximation, enhancing training stability and reducing reliance on Monte Carlo estimation. Extensive experiments on high-dimensional synthetic multimodal benchmarks and real-world datasets demonstrate that AF consistently outperforms state-of-the-art flow-based methods in sampling quality (e.g., effective sample size, maximum mean discrepancy), mode coverage, and training efficiency. Notably, AF successfully generates samples from adversarial distributions that are intractable for conventional samplers.

Technology Category

Application Category

📝 Abstract
Sampling from high-dimensional, multi-modal distributions remains a fundamental challenge across domains such as statistical Bayesian inference and physics-based machine learning. In this paper, we propose Annealing Flow (AF), a method built on Continuous Normalizing Flow (CNF) for sampling from high-dimensional and multi-modal distributions. AF is trained with a dynamic Optimal Transport (OT) objective incorporating Wasserstein regularization, and guided by annealing procedures, facilitating effective exploration of modes in high-dimensional spaces. Compared to recent NF methods, AF greatly improves training efficiency and stability, with minimal reliance on MC assistance. We demonstrate the superior performance of AF compared to state-of-the-art methods through experiments on various challenging distributions and real-world datasets, particularly in high-dimensional and multi-modal settings. We also highlight AF potential for sampling the least favorable distributions.
Problem

Research questions and friction points this paper is trying to address.

Sampling high-dimensional multi-modal distributions effectively
Improving training efficiency and stability in generative models
Exploring least favorable distributions with minimal MC assistance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Annealing Flow based on CNF
Dynamic Optimal Transport objective
Wasserstein regularization guidance
🔎 Similar Papers
No similar papers found.
D
Dongze Wu
H. Milton Stewart School of Industrial and Systems Engineering (ISyE), Georgia Institute of Technology, Atlanta, GA 30332, USA
Yao Xie
Yao Xie
Coca-Cola Foundation Chair and Professor, Georgia Institute of Technology
statisticsmachine learningoptimizationsignal processing