🤖 AI Summary
This paper addresses the core challenge of efficient i.i.d. sampling from Boltzmann distributions, which is critical in molecular dynamics and related fields, by proposing a neural diffusion sampler that requires only the energy function and no pre-collected target samples. Methodologically, it introduces Noised Energy Matching (NEM), a diffusion-based sampler trained by regressing a neural network onto the energies of noised data, together with a bootstrapping mechanism (BNEM) that trades off bias against variance. Theoretically, this design provably reduces estimation variance, enhancing sampling robustness and generalization. Empirically, the method achieves state-of-the-art performance on a 40-mode Gaussian mixture and a 4-particle double-well potential task: generated samples are closer to independent draws from the target, convergence is more stable, and training proceeds without reliance on any pre-acquired samples.
📝 Abstract
Developing an efficient sampler capable of generating independent and identically distributed (IID) samples from a Boltzmann distribution is a crucial challenge in scientific research, e.g., molecular dynamics. In this work, we learn neural samplers given energy functions instead of data sampled from the Boltzmann distribution. By learning the energies of noised data, we propose a diffusion-based sampler, Noised Energy Matching (NEM), which theoretically has lower variance, at the cost of higher complexity, compared to related works. Furthermore, a novel bootstrapping technique is applied to NEM, yielding Bootstrapped NEM (BNEM), which balances bias and variance. We evaluate NEM and BNEM on a 2-dimensional, 40-mode Gaussian Mixture Model (GMM) and a 4-particle double-well potential (DW-4). The experimental results demonstrate that BNEM achieves state-of-the-art performance while being more robust.
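The core idea of matching the energies of noised data can be illustrated with a Monte Carlo estimate of the noised energy, which a network would then regress onto. The sketch below is illustrative only and is not the paper's implementation: it assumes a variance-exploding noising kernel N(x_t; x_0, σ_t²I), a toy 2D two-mode energy (not the paper's 40-GMM benchmark), and hypothetical names (`energy`, `mc_noised_energy`).

```python
import numpy as np

def energy(x):
    # Toy target energy: 2D mixture of two unit Gaussians (illustrative only).
    centers = np.array([[-2.0, 0.0], [2.0, 0.0]])
    d2 = ((x[None, :] - centers) ** 2).sum(-1)
    return -np.log(np.exp(-0.5 * d2).sum())

def mc_noised_energy(x_t, sigma_t, K=1000, rng=None):
    """MC estimate of the noised energy -log E_{x0 ~ N(x_t, sigma_t^2 I)}[exp(-E(x0))].

    This is the kind of regression target a noised-energy-matching network
    could be trained on; K controls the estimator's variance.
    """
    rng = rng or np.random.default_rng(0)
    # Sample x0 from the symmetric Gaussian kernel around x_t.
    x0 = x_t + sigma_t * rng.standard_normal((K, x_t.shape[0]))
    e = np.array([energy(x) for x in x0])
    # Stabilized -log(mean(exp(-e))) via a log-sum-exp shift.
    m = e.min()
    return m - np.log(np.exp(-(e - m)).mean())

x_t = np.array([0.5, 0.0])
e_small = mc_noised_energy(x_t, sigma_t=1e-3)  # approaches the clean energy as noise -> 0
e_large = mc_noised_energy(x_t, sigma_t=2.0)   # a smoothed energy landscape at high noise
```

At small noise the estimate recovers the clean energy, while at large noise it flattens the landscape, which is what lets a diffusion sampler move between modes before annealing back to the target.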