🤖 AI Summary
To address the slow convergence and poorly adaptive preconditioning strategies of traditional Langevin Monte Carlo (LMC) when sampling from high-dimensional, ill-conditioned distributions, this paper proposes Subspace Langevin Monte Carlo (SLMC). SLMC carries the subspace descent paradigm, previously developed in Euclidean space, over to Wasserstein space, accelerating preconditioned Langevin dynamics via low-dimensional subspace projections. Theoretically, the paper establishes a convergence analysis built on a relative condition number, removing the need for global strong-convexity and smoothness assumptions, and proves that the linear convergence rate is governed by the relative condition number within the subspace. Empirically, on ill-conditioned Gaussian targets, SLMC achieves 3–5× higher sampling efficiency than standard LMC and stochastic gradient Langevin dynamics (SGLD), significantly improving robustness and scalability on high-dimensional anisotropic distributions.
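To make the subspace projection idea concrete, here is a minimal sketch of one plausible form of the update. The rank `k`, the uniformly random orthonormal subspace, and the constant step size are our illustrative assumptions rather than the paper's exact scheme, and the simple `V V^T` projection stands in for a general low-rank preconditioner.

```python
import numpy as np

def slmc_sample(grad_logp, x0, n_iters, step, k, rng):
    """Illustrative subspace Langevin loop (a hypothetical form of SLMC).

    Each iteration draws a random k-dimensional subspace and takes a
    Langevin step whose drift and noise are both confined to it.
    """
    x = x0.copy()
    d = x.size
    for _ in range(n_iters):
        # Random orthonormal basis V (d x k) from the QR factorization
        # of a Gaussian matrix; V V^T projects onto a random subspace.
        V, _ = np.linalg.qr(rng.standard_normal((d, k)))
        # Projected Euler-Maruyama Langevin update:
        #   x <- x + step * V V^T grad log p(x) + sqrt(2 * step) * V xi,
        # with xi ~ N(0, I_k), so the injected noise also lives in span(V).
        xi = rng.standard_normal(k)
        x = x + step * V @ (V.T @ grad_logp(x)) + np.sqrt(2.0 * step) * (V @ xi)
    return x
```

Confining both the drift and the injected noise to the same subspace keeps each update a discretized Langevin step within span(V), which is consistent with the stated result that the convergence rate depends only on conditioning inside the subspace.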
📝 Abstract
We develop a new efficient method for high-dimensional sampling called Subspace Langevin Monte Carlo. The primary application of this method is the efficient implementation of Preconditioned Langevin Monte Carlo. To demonstrate the usefulness of the new method, we extend ideas from subspace descent methods in Euclidean space to solve a specific optimization problem over Wasserstein space. Our theoretical analysis demonstrates the advantageous convergence regimes of the proposed method, which depend on relative-conditioning assumptions common to mirror descent methods. We back up our theory with experimental evidence on sampling from an ill-conditioned Gaussian distribution.
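For a sense of how the ill-conditioned Gaussian experiment might look, the snippet below runs the sketch from above on a diagonal Gaussian whose covariance eigenvalues span three orders of magnitude. The dimension, condition number, rank, step size, and iteration budget are all made-up illustrative values, not the paper's experimental setup.

```python
rng = np.random.default_rng(0)
d = 100
eigs = np.logspace(0, 3, d)        # covariance spectrum: condition number 1e3
grad_logp = lambda x: -x / eigs    # target N(0, diag(eigs)): grad log p(x) = -Sigma^{-1} x
x = slmc_sample(grad_logp, x0=rng.standard_normal(d),
                n_iters=20_000, step=0.2, k=10, rng=rng)
print(x[:5])                       # one approximate sample from the target
```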