Sampling non-log-concave densities via Hessian-free high-resolution dynamics

📅 2026-01-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the lack of theoretical guarantees for sampling from non-log-concave target distributions induced by non-convex potential functions. To this end, the authors propose a sampling method based on Hessian-free high-resolution (HFHR) dynamics. Using reflection/synchronous coupling and a Lyapunov-weighted Wasserstein distance, they establish the first exponential convergence guarantee for HFHR in the non-convex setting, and prove that, under an asymptotically linear growth condition on the gradient, its convergence rate strictly outperforms that of kinetic Langevin dynamics (KLD). Empirical evaluations on multi-well potentials, Bayesian linear regression with an $L^p$ regularizer, Bayesian binary classification, and real-data Bayesian logistic regression with neural-network feature processing consistently demonstrate the accelerated convergence and superior performance of the proposed HFHR sampler.

📝 Abstract
We study the problem of sampling from a target distribution $\pi(q)\propto e^{-U(q)}$ on $\mathbb{R}^d$, where $U$ can be non-convex, via the Hessian-free high-resolution (HFHR) dynamics, a second-order Langevin-type process that has $e^{-U(q)-\frac12|p|^2}$ as its unique invariant distribution and reduces to kinetic Langevin dynamics (KLD) as the resolution parameter $\alpha\to0$. The existing theory for HFHR dynamics is restricted to strongly convex $U$, although numerical experiments are promising in non-convex settings as well. We focus on the convergence of HFHR dynamics when $U$ can be non-convex, bridging this gap between theory and practice. Under standard dissipativity and smoothness assumptions on $U$, we adopt the reflection/synchronous coupling method. This yields a Lyapunov-weighted Wasserstein distance in which the HFHR semigroup is exponentially contractive for all sufficiently small $\alpha>0$ whenever KLD is. We further show that, under the additional assumption that $\nabla U$ grows asymptotically linearly at infinity, the contraction rate of HFHR dynamics is strictly better than that of KLD, with an explicit gain. As a case study, we verify the assumptions and the resulting acceleration for three examples: a multi-well potential, Bayesian linear regression with an $L^p$ regularizer, and Bayesian binary classification. We conduct numerical experiments on these examples, as well as an additional example of Bayesian logistic regression on real data processed by neural networks, which illustrate the efficiency of algorithms based on HFHR dynamics and verify their acceleration and superior performance compared to KLD.
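The abstract characterizes HFHR only by its invariant distribution $e^{-U(q)-\frac12|p|^2}$ and by the fact that it reduces to KLD as $\alpha\to 0$; a natural SDE consistent with both properties augments KLD with an overdamped Langevin correction on $q$ scaled by $\alpha$. The sketch below is a minimal Euler–Maruyama discretization under that assumed form (the specific SDE, the double-well potential, the friction parameter `gamma`, and all step sizes are illustrative assumptions, not the paper's exact algorithm):

```python
import numpy as np

def grad_U(q):
    # Gradient of a simple non-convex double-well potential
    # U(q) = (q^2 - 1)^2 / 4, akin to the multi-well case study.
    return q * (q**2 - 1)

def hfhr_step(q, p, h, alpha, gamma, rng):
    """One Euler-Maruyama step of the ASSUMED HFHR-type SDE:
        dq = (p - alpha * grad U(q)) dt + sqrt(2*alpha) dB_q
        dp = (-grad U(q) - gamma * p) dt + sqrt(2*gamma) dB_p
    Both the Hamiltonian part and the two Langevin corrections leave
    e^{-U(q) - |p|^2/2} invariant; setting alpha = 0 recovers KLD.
    """
    g = grad_U(q)
    q_new = q + h * (p - alpha * g) \
            + np.sqrt(2.0 * alpha * h) * rng.standard_normal(q.shape)
    p_new = p + h * (-g - gamma * p) \
            + np.sqrt(2.0 * gamma * h) * rng.standard_normal(p.shape)
    return q_new, p_new

def sample(n_steps=20000, h=0.01, alpha=0.5, gamma=1.0, seed=0):
    """Run the chain from the origin and return the position trajectory."""
    rng = np.random.default_rng(seed)
    q = np.zeros(1)
    p = np.zeros(1)
    traj = np.empty((n_steps, 1))
    for i in range(n_steps):
        q, p = hfhr_step(q, p, h, alpha, gamma, rng)
        traj[i] = q
    return traj
```

With `alpha > 0`, the extra overdamped term injects noise directly into the position variable, which is the heuristic mechanism behind the accelerated contraction compared to KLD (`alpha = 0`).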
Problem

Research questions and friction points this paper is trying to address.

non-convex sampling
Hessian-free dynamics
convergence analysis
Langevin dynamics
non-log-concave distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hessian-free high-resolution dynamics
non-convex sampling
exponential contractivity
Lyapunov-weighted Wasserstein distance
accelerated convergence