Proximal Diffusion Neural Sampler

📅 2025-10-04
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address mode collapse when sampling from multimodal unnormalized target distributions, where high energy barriers separate the modes, this paper proposes a diffusion-based neural sampling framework grounded in proximal point methods on the space of path measures. The method decomposes the sampler-learning problem into a sequence of progressive proximal subproblems, substantially enhancing cross-modal exploration. It further introduces a proximal weighted denoising cross-entropy (WDCE) objective that integrates diffusion modeling with stochastic optimal control theory to ensure robust training. Experiments demonstrate that the approach outperforms existing baselines on both continuous and discrete sampling tasks. Notably, in strongly multimodal domains such as molecular dynamics and statistical physics, it achieves significant improvements in distribution coverage and sampling stability.
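The core idea of decomposing a hard optimization into a sequence of regularized subproblems can be illustrated with the classical proximal point method on a scalar objective. This is only a minimal sketch of the general technique; the paper applies it on the space of path measures with a KL proximal term, and all names and step sizes below are illustrative:

```python
def proximal_point(grad_f, x0, lam=1.0, outer_steps=20, inner_steps=200, lr=0.01):
    """Toy scalar analogue of a proximal point scheme.

    Each outer stage solves the easier regularized subproblem
        x_{k+1} = argmin_x f(x) + (1 / (2 * lam)) * (x - x_k) ** 2
    by plain gradient descent, so the iterate approaches the minimizer
    through a sequence of well-conditioned steps.
    """
    anchor = x0
    for _ in range(outer_steps):
        x = anchor
        for _ in range(inner_steps):
            g = grad_f(x) + (x - anchor) / lam  # gradient of the subproblem
            x -= lr * g
        anchor = x  # the solved subproblem becomes the next proximal anchor
    return anchor


# Minimize f(x) = (x - 3)^2; the iterates converge to the minimizer x = 3.
x_star = proximal_point(lambda x: 2 * (x - 3), x0=0.0)
```

The proximal term keeps each stage close to the previous solution, which is exactly the property the paper exploits to avoid collapsing onto a single mode: no single subproblem has to bridge the full gap to the target.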

📝 Abstract
The task of learning a diffusion-based neural sampler for drawing samples from an unnormalized target distribution can be viewed as a stochastic optimal control problem on path measures. However, the training of neural samplers can be challenging when the target distribution is multimodal with significant barriers separating the modes, potentially leading to mode collapse. We propose a framework named **Proximal Diffusion Neural Sampler (PDNS)** that addresses these challenges by tackling the stochastic optimal control problem via a proximal point method on the space of path measures. PDNS decomposes the learning process into a series of simpler subproblems; this staged procedure traces a progressively refined path to the desired distribution and promotes thorough exploration across modes. For a practical and efficient realization, we instantiate each proximal step with a proximal weighted denoising cross-entropy (WDCE) objective. We demonstrate the effectiveness and robustness of PDNS through extensive experiments on both continuous and discrete sampling tasks, including challenging scenarios in molecular dynamics and statistical physics.
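A plausible form of the proximal step on path measures, with notation assumed here rather than taken verbatim from the paper, is:

```latex
P^{(k+1)} \;=\; \operatorname*{arg\,min}_{P}\; \mathcal{L}(P) \;+\; \frac{1}{\lambda}\, D_{\mathrm{KL}}\!\left(P \,\big\|\, P^{(k)}\right)
```

where $\mathcal{L}(P)$ denotes the stochastic optimal control objective driving $P$ toward the target path measure, and $\lambda > 0$ sets the proximal step size: a small $\lambda$ keeps each stage close to the previous sampler $P^{(k)}$, while a large $\lambda$ lets the subproblem approach the original objective.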
Problem

Research questions and friction points this paper is trying to address.

Avoiding mode collapse when learning diffusion neural samplers for multimodal distributions
Solving stochastic optimal control via proximal methods on path measures
Enabling thorough mode exploration in continuous and discrete sampling tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proximal point method optimizes path measures
Staged subproblems gradually approach target distribution
Proximal weighted denoising cross-entropy enables efficient realization