AI Summary
This work addresses the challenges of mode collapse and training instability commonly encountered when sampling from high-dimensional, complex unnormalized densities. The authors propose a generalized fixed-point diffusion matching framework that casts the diffusion process as a learnable stochastic transport map grounded in Nelson's relation. By incorporating damped iterations and regularization mechanisms, the method establishes a single-objective, scalable, and stable optimization strategy that eliminates restrictive assumptions on prior distributions. Experimental results on both synthetic densities and high-dimensional molecular data demonstrate that the approach significantly enhances sampling stability and diversity, achieving state-of-the-art performance. Moreover, it enables sampling at unprecedented scales while effectively preserving the multimodal structure of the target distributions.
Abstract
Sampling from unnormalized densities using diffusion models has emerged as a powerful paradigm. However, while recent approaches that use least-squares "matching" objectives have improved scalability, they often entail significant trade-offs, such as restricting the class of prior distributions or relying on unstable optimization schemes. By generalizing these methods as special cases of fixed-point iterations rooted in Nelson's relation, we develop a new method that addresses these limitations, called Bridge Matching Sampler (BMS). Our approach learns a stochastic transport map between arbitrary prior and target distributions with a single, scalable, and stable objective. Furthermore, we introduce a damped variant of this iteration that incorporates a regularization term to mitigate mode collapse and further stabilize training. Empirically, we demonstrate that our method enables sampling at unprecedented scales while preserving mode diversity, achieving state-of-the-art results on complex synthetic densities and high-dimensional molecular benchmarks.
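To make the damping idea concrete, the sketch below shows a generic damped fixed-point iteration, x ← (1 − α)·x + α·F(x), on a toy scalar map. This is only an illustration of the numerical principle the abstract invokes; the function `F`, the damping factor `alpha`, and the stopping rule are all assumptions for this example and do not represent BMS's actual training objective, which operates on learned transport maps rather than point iterates.

```python
import numpy as np

def damped_fixed_point(F, x0, alpha=0.7, tol=1e-10, max_iter=1000):
    """Damped fixed-point iteration: x <- (1 - alpha) * x + alpha * F(x).

    alpha in (0, 1] is the damping factor; alpha = 1 recovers the plain
    iteration x <- F(x). Smaller alpha averages the update with the current
    iterate, which can stabilize otherwise oscillatory or divergent schemes.
    Illustrative sketch only, not the paper's method.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_next = (1.0 - alpha) * x + alpha * F(x)
        if np.linalg.norm(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Toy usage: the fixed point of F(x) = cos(x) (the Dottie number).
root = damped_fixed_point(np.cos, x0=1.0, alpha=0.7)
```

In the paper's setting, the analogous damping is applied at the level of the learned drift between successive fixed-point iterates, with the regularization term playing a similar stabilizing role.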