From stability of Langevin diffusion to convergence of proximal MCMC for non-log-concave sampling

📅 2025-05-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses probabilistic sampling under nonconvex and nonsmooth potentials, a setting prevalent in imaging inverse problems and other non-log-concave posterior distributions. We analyze the Proximal Stochastic Gradient Langevin Algorithm (PSGLA), which combines forward-backward splitting with a Langevin step, and give, to our knowledge, the first rigorous convergence analysis of PSGLA under nonconvex potentials. The proof rests on a new stability result for the discrete-time Unadjusted Langevin Algorithm (ULA) under drift approximations, assuming the potential is strongly convex at infinity, combined with the smoothing properties of the Moreau envelope. Experiments on synthetic data and image reconstruction show that PSGLA converges faster than the Stochastic Gradient Langevin Algorithm while preserving its restoration properties. Our key contributions are: (i) a stability result for discrete-time ULA under drift approximations; and (ii) the first convergence proof of PSGLA for nonconvex potentials, obtained by connecting proximal (implicit) regularization to sampling stability.
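
As a concrete illustration, the following minimal Python sketch shows one PSGLA iteration under the usual splitting of the potential U = f + g into a smooth part f and a proximable part g; the names grad_f, prox_g, and the step size gamma are our own illustrative assumptions, not code from the paper, and the exact update used there may differ in details.

import numpy as np

def psgla_step(x, grad_f, prox_g, gamma, rng):
    # Forward step: gradient of the smooth part f plus Gaussian
    # Langevin noise (a ULA step on f).
    y = x - gamma * grad_f(x) + np.sqrt(2.0 * gamma) * rng.standard_normal(x.shape)
    # Backward step: proximal map of the non-smooth part g.
    return prox_g(y, gamma)

# Toy target exp(-U) with U(x) = 0.5*||x||^2 + ||x||_1:
# f is the quadratic, g the l1 term (prox = soft-thresholding).
rng = np.random.default_rng(0)
grad_f = lambda x: x
prox_g = lambda y, t: np.sign(y) * np.maximum(np.abs(y) - t, 0.0)
x = rng.standard_normal(5)
for _ in range(1000):
    x = psgla_step(x, grad_f, prox_g, gamma=0.05, rng=rng)
print(x)  # one approximate sample from the target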

📝 Abstract
We consider the problem of sampling from distributions stemming from non-convex potentials with the Unadjusted Langevin Algorithm (ULA). We prove the stability of the discrete-time ULA to drift approximations under the assumption that the potential is strongly convex at infinity. In many contexts, e.g. imaging inverse problems, potentials are non-convex and non-smooth. The Proximal Stochastic Gradient Langevin Algorithm (PSGLA) is a popular algorithm to handle such potentials. It combines the forward-backward optimization algorithm with a ULA step. Our main stability result, combined with properties of the Moreau envelope, allows us to derive the first proof of convergence of PSGLA for non-convex potentials. We empirically validate our methodology on synthetic data and in the context of imaging inverse problems. In particular, we observe that PSGLA exhibits faster convergence rates than the Stochastic Gradient Langevin Algorithm for posterior sampling while preserving its restoration properties.
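
For reference, the ULA update the abstract refers to, written in our own notation (potential $U$, step size $\gamma$); this is the standard Euler-Maruyama discretization of the Langevin diffusion, not a formula quoted from the paper:

$$x_{k+1} = x_k - \gamma \nabla U(x_k) + \sqrt{2\gamma}\,\xi_k, \qquad \xi_k \sim \mathcal{N}(0, I_d).$$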
Problem

Research questions and friction points this paper is trying to address.

Sampling from non-convex potentials with the Unadjusted Langevin Algorithm (ULA)
Proving stability of discrete-time ULA under drift approximations (sketched after this list)
Establishing convergence of the Proximal Stochastic Gradient Langevin Algorithm (PSGLA)
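
The drift-approximation setting in the second point can be sketched as running ULA with an inexact drift $b$ in place of $\nabla U$ (our notation; the paper's precise assumptions and error metric are stated there):

$$x_{k+1} = x_k - \gamma\, b(x_k) + \sqrt{2\gamma}\,\xi_k, \qquad b \approx \nabla U,$$

with stability meaning the law of this chain stays close to that of the exact-drift chain when the approximation error is small.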
Innovation

Methods, ideas, or system contributions that make the work stand out.

Stability of the discrete-time Unadjusted Langevin Algorithm for non-convex sampling
First convergence proof for the Proximal Stochastic Gradient Langevin Algorithm under non-convex potentials
Moreau envelope smoothing enables the stability argument for non-convex potentials (see the sketch after this list)
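
Behind the last point are two standard facts about the Moreau envelope $g_\lambda$ of a function $g$ (written in our notation, with smoothing parameter $\lambda > 0$; for non-convex $g$ they require suitable weak-convexity assumptions): the envelope is a smoothed surrogate of $g$, and its gradient is expressed through the proximal map, which is one way a proximal step can be read as a drift approximation:

$$g_\lambda(x) = \min_{y}\Bigl\{ g(y) + \tfrac{1}{2\lambda}\|x - y\|^2 \Bigr\}, \qquad \nabla g_\lambda(x) = \tfrac{1}{\lambda}\bigl(x - \operatorname{prox}_{\lambda g}(x)\bigr).$$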