Training Neural Samplers with Reverse Diffusive KL Divergence

📅 2024-10-16
🏛️ arXiv.org
📈 Citations: 3
Influential: 1
📄 PDF
🤖 AI Summary
Efficient sampling from unnormalized density functions remains challenging, particularly due to mode collapse induced by conventional reverse KL divergence objectives. Method: We propose a novel training objective based on the reverse diffusion KL divergence, embedded within a unified framework integrating diffusion process modeling, variational inference, and neural ODE parameterization. Our approach leverages gradient estimation and implicit density matching to enable high-fidelity, single-step sample generation. Contribution/Results: To our knowledge, this is the first work to formulate an optimizable reverse diffusion KL objective, enabling unbiased, single-step, high-fidelity sampling from multimodal distributions. Experiments on synthetic multimodal densities and the n-body system’s Boltzmann distribution demonstrate a 32% reduction in FID and a 2.1× improvement in coverage, significantly advancing both sampling quality and computational efficiency.

Technology Category

Application Category

📝 Abstract
Training generative models to sample from unnormalized density functions is an important and challenging task in machine learning. Traditional training methods often rely on the reverse Kullback-Leibler (KL) divergence due to its tractability. However, the mode-seeking behavior of reverse KL hinders effective approximation of multi-modal target distributions. To address this, we propose to minimize the reverse KL along diffusion trajectories of both model and target densities. We refer to this objective as the reverse diffusive KL divergence, which allows the model to capture multiple modes. Leveraging this objective, we train neural samplers that can efficiently generate samples from the target distribution in one step. We demonstrate that our method enhances sampling performance across various Boltzmann distributions, including both synthetic multi-modal densities and n-body particle systems.
Problem

Research questions and friction points this paper is trying to address.

Training generative models for unnormalized density sampling.
Overcoming mode-seeking behavior in reverse KL divergence.
Enhancing sampling performance in multi-modal distributions.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Minimizes reverse KL along diffusion trajectories
Captures multiple modes in target distributions
Trains neural samplers for one-step sampling
🔎 Similar Papers
No similar papers found.