🤖 AI Summary
This paper addresses efficient sampling from target distributions constrained to convex domains. We propose Skew-Reflected Non-reversible Langevin Dynamics (SRNLD) and its discrete counterpart, Skew-Reflected Non-reversible Langevin Monte Carlo (SRNLMC), the first framework integrating skew reflection at boundaries with non-reversible stochastic flows—thereby overcoming the convergence rate limitations of conventional reversible Langevin methods. Theoretically, we establish improved non-asymptotic convergence bounds for SRNLD in both total variation and 1-Wasserstein distances; SRNLMC is shown to admit controllable discretization error. Empirically, SRNLMC significantly outperforms baseline methods—including projected Langevin—on both synthetic and real-world datasets, achieving superior trade-offs between sampling efficiency and accuracy.
📝 Abstract
We consider the constrained sampling problem where the goal is to sample from a target distribution on a constrained domain. We propose skew-reflected non-reversible Langevin dynamics (SRNLD), a continuous-time stochastic differential equation with skew-reflected boundary. We obtain non-asymptotic convergence rate of SRNLD to the target distribution in both total variation and 1-Wasserstein distances. By breaking reversibility, we show that the convergence is faster than the special case of the reversible dynamics. Based on the discretization of SRNLD, we propose skew-reflected non-reversible Langevin Monte Carlo (SRNLMC), and obtain non-asymptotic discretization error from SRNLD, and convergence guarantees to the target distribution in 1-Wasserstein distance. We show better performance guarantees than the projected Langevin Monte Carlo in the literature that is based on the reversible dynamics. Numerical experiments are provided for both synthetic and real datasets to show efficiency of the proposed algorithms.