🤖 AI Summary
Langevin Monte Carlo (LMC) suffers from slow convergence in high-dimensional, large-scale sampling, with strong dependence on dimension $d$ and accuracy $\varepsilon$.
Method: We propose the first high-order LMC algorithm applicable to arbitrary order $P \geq 3$, integrating high-order Langevin dynamics, operator splitting, and exact numerical integration, analyzed rigorously under the Wasserstein distance.
Contribution/Results: We establish the first mixing-time upper bound of $O(d^{1/R} / \varepsilon^{1/(2R)})$, where $R$ increases with $P$, substantially weakening the dependence on both $d$ and $\varepsilon$. Experiments demonstrate superior convergence rates and sampling efficiency over state-of-the-art methods under high-dimensional, strongly convex, smooth potentials. This work provides a novel computational tool for high-dimensional Bayesian inference and large-scale data science.
📝 Abstract
Langevin algorithms are popular Markov chain Monte Carlo (MCMC) methods for large-scale sampling problems that often arise in data science. We propose Monte Carlo algorithms based on discretizations of $P$-th order Langevin dynamics for any $P \geq 3$. We design the $P$-th order Langevin Monte Carlo (LMC) algorithms by combining splitting and accurate integration methods. We obtain Wasserstein convergence guarantees for sampling from distributions with log-concave and smooth densities. Specifically, the mixing time of the $P$-th order LMC algorithm scales as $O\left(d^{\frac{1}{R}}/\varepsilon^{\frac{1}{2R}}\right)$ for $R = 4\cdot 1_{\{P=3\}} + (2P-1)\cdot 1_{\{P\geq 4\}}$, which yields a better dependence on the dimension $d$ and the accuracy level $\varepsilon$ as $P$ grows. Numerical experiments illustrate the efficiency of our proposed algorithms.
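As background for readers unfamiliar with Langevin sampling, here is a minimal sketch of the classical first-order (overdamped) LMC baseline that the paper's high-order methods improve upon; this is not the paper's algorithm, and all function names and parameter values below are illustrative. The update is the Euler–Maruyama discretization $x_{k+1} = x_k - h\,\nabla U(x_k) + \sqrt{2h}\,\xi_k$ for a log-concave target $\pi \propto e^{-U}$, shown here for a standard Gaussian target where $U(x) = \|x\|^2/2$:

```python
import numpy as np

def lmc_sample(grad_U, x0, step, n_steps, rng):
    """First-order Langevin Monte Carlo (illustrative baseline):
    x_{k+1} = x_k - step * grad_U(x_k) + sqrt(2*step) * N(0, I)."""
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        noise = rng.standard_normal(x.size)
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * noise
        samples[k] = x
    return samples

# Target: standard Gaussian in d = 5, so U(x) = ||x||^2 / 2 and grad_U(x) = x.
rng = np.random.default_rng(0)
samples = lmc_sample(lambda x: x, np.zeros(5), step=0.05, n_steps=20000, rng=rng)
tail = samples[5000:]  # discard burn-in
print(tail.mean(), tail.var())  # should be near 0 and near 1
```

With a fixed step size $h$, the chain mixes toward a biased approximation of $\pi$; the bias and the mixing time both degrade with dimension, which is the dependence the paper's $P$-th order schemes are designed to weaken.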