🤖 AI Summary
Traditional Langevin Monte Carlo (LMC) relies on the fluctuation–dissipation theorem, necessitating a trade-off between stochastic fluctuations and deterministic dissipation—introducing discretization bias and limiting sampling efficiency. This work proposes microcanonical Langevin Monte Carlo (MCLMC), the first continuous-time stochastic sampling framework that requires only the configuration-space marginal to follow the target canonical distribution, without enforcing the fluctuation–dissipation relation. We establish its ergodicity and unbiasedness theoretically; crucially, we prove that the drift and diffusion components each preserve the stationary distribution, thereby eliminating the dominant source of discretization bias. Under convex potentials, exponential convergence is rigorously guaranteed. Numerical experiments on the φ⁴ lattice model demonstrate substantial speedups over Hamiltonian Monte Carlo (HMC): 12× faster on an 8×8 lattice and 32× faster on a 64×64 lattice, with acceleration scaling favorably with system size.
📝 Abstract
Stochastic sampling algorithms such as Langevin Monte Carlo are inspired by physical systems in a heat bath. Their equilibrium distribution is the canonical ensemble given by a prescribed target distribution, so they must balance fluctuation and dissipation as dictated by the fluctuation-dissipation theorem. Contrary to common belief, we show that the fluctuation-dissipation theorem is not required, because only the configuration-space distribution, and not the full phase-space distribution, needs to be canonical. We propose continuous-time Microcanonical Langevin Monte Carlo (MCLMC) as a dissipation-free system of stochastic differential equations (SDEs). We derive the corresponding Fokker-Planck equation and show that the stationary distribution is the microcanonical ensemble with the desired canonical distribution on configuration space. We prove that MCLMC is ergodic for any nonzero amount of stochasticity and that, for smooth, convex potentials, the expectation values converge exponentially fast. Furthermore, the deterministic drift and the stochastic diffusion separately preserve the stationary distribution. This uncommon property is attractive for practical implementations, as it implies that drift-diffusion discretization schemes are bias-free, so the only source of bias is the discretization of the deterministic dynamics. We apply MCLMC to a lattice $\phi^4$ model, where Hamiltonian Monte Carlo (HMC) is currently the state-of-the-art sampler. For the same accuracy, MCLMC converges 12 times faster than HMC on an $8\times 8$ lattice. On a $64\times 64$ lattice, it is already 32 times faster. The trend is expected to persist on larger lattices, which are of particular interest, for example, in lattice quantum chromodynamics.
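The abstract's key practical point is that the drift and the diffusion each preserve the stationary distribution, so a splitting scheme that alternates the two introduces no bias from the splitting itself. The sketch below illustrates this structure on a toy problem; it is not the paper's reference implementation. It assumes a standard Gaussian target $U(x) = |x|^2/2$, and the step sizes `eps`, `eta`, the Euler-style drift update, and the renormalizing direction refresh are all illustrative choices.

```python
import numpy as np

def grad_U(x):
    # Illustrative target (assumption): standard Gaussian, U(x) = |x|^2 / 2.
    return x

def drift(x, u, eps):
    """Deterministic microcanonical update. The force is projected onto the
    tangent space of the unit sphere, so the velocity u stays a unit vector
    (renormalized here for exactness under the Euler-style discretization)."""
    d = x.size
    g = grad_U(x)
    u = u - eps * (g - u * (u @ g)) / (d - 1)
    u = u / np.linalg.norm(u)
    x = x + eps * u
    return x, u

def diffusion(u, eta, rng):
    """Stochastic refresh of the velocity *direction* only: add isotropic
    Gaussian noise and renormalize. No dissipation term is needed."""
    z = rng.standard_normal(u.size)
    u = u + eta * z
    return u / np.linalg.norm(u)

rng = np.random.default_rng(0)
d = 10
x = rng.standard_normal(d)
u = rng.standard_normal(d)
u /= np.linalg.norm(u)

# Drift-diffusion splitting: apply the two parts in turn. Per the abstract,
# the only discretization bias comes from the drift step, not the splitting.
for _ in range(1000):
    x, u = drift(x, u, eps=0.1)
    u = diffusion(u, eta=0.1, rng=rng)

# the splitting keeps the velocity on the unit sphere at every step
assert abs(np.linalg.norm(u) - 1.0) < 1e-9
```

The design point the sketch makes concrete: because `diffusion` only rotates `u` on the unit sphere, the speed of the particle is never damped, which is what "dissipation-free" means here.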