The Performance Of The Unadjusted Langevin Algorithm Without Smoothness Assumptions

📅 2025-02-05
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses efficient sampling from nonsmooth and non-log-concave target distributions. The authors propose a Langevin algorithm that operates directly on the original (potentially nondifferentiable) density, requiring neither smoothing preprocessing nor hyperparameter tuning. For the first time, they establish a non-asymptotic Wasserstein-2 convergence theory for the Unadjusted Langevin Algorithm (ULA) under *no* smoothness or log-concavity assumptions, bypassing computationally expensive techniques such as the Moreau–Yosida envelope or Gaussian smoothing. The algorithm achieves an $O(1/\sqrt{k})$ convergence rate in Wasserstein-2 distance after $k$ iterations. The paper further derives an explicit generalization error bound for excess risk minimization. The theoretical framework accommodates ill-conditioned and irregular target distributions, substantially broadening both the applicability and practical utility of Langevin-based sampling methods.
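The iteration behind ULA is the Euler–Maruyama discretization of the Langevin diffusion: $x_{k+1} = x_k - \gamma\, \nabla U(x_k) + \sqrt{2\gamma}\, \xi_k$ with i.i.d. standard Gaussian noise $\xi_k$. The paper's point is that this update can be run directly on a nonsmooth potential $U$; the sketch below illustrates that idea on a toy target, the standard Laplace density $\pi(x) \propto e^{-|x|}$, whose potential $U(x) = |x|$ is nondifferentiable at the origin (the subgradient oracle, step size, and target here are illustrative choices, not the paper's experiments):

```python
import numpy as np

def ula_sample(subgrad, x0, step, n_iters, rng):
    """Unadjusted Langevin Algorithm with a (sub)gradient oracle.

    Update rule: x_{k+1} = x_k - step * subgrad(x_k) + sqrt(2*step) * N(0, I).
    Returns the full trajectory so early iterates can be discarded as burn-in.
    """
    x = np.asarray(x0, dtype=float)
    traj = np.empty((n_iters, x.size))
    for k in range(n_iters):
        noise = rng.standard_normal(x.size)
        x = x - step * subgrad(x) + np.sqrt(2.0 * step) * noise
        traj[k] = x
    return traj

# Toy nonsmooth target: U(x) = |x|, with subgradient sign(x).
rng = np.random.default_rng(0)
samples = ula_sample(np.sign, x0=np.zeros(1), step=0.01,
                     n_iters=50_000, rng=rng)
post_burnin = samples[10_000:]
```

After burn-in, the empirical mean and variance of the chain should be close to those of the Laplace(0, 1) target (mean 0, variance 2), up to the discretization bias that the paper's Wasserstein-2 bounds quantify.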

📝 Abstract
In this article, we study the problem of sampling from distributions whose densities are not necessarily smooth nor log-concave. We propose a simple Langevin-based algorithm that does not rely on popular but computationally challenging techniques, such as the Moreau–Yosida envelope or Gaussian smoothing. We derive non-asymptotic guarantees for the convergence of the algorithm to the target distribution in Wasserstein distances. Non-asymptotic bounds are also provided for the performance of the algorithm as an optimizer, specifically for the solution of associated excess risk optimization problems.
Problem

Research questions and friction points this paper is trying to address.

Sampling from non-smooth, non-log-concave distributions efficiently
Proposing a simple Langevin algorithm without complex techniques
Providing non-asymptotic convergence guarantees in Wasserstein distances
Innovation

Methods, ideas, or system contributions that make the work stand out.

Langevin-based algorithm without smoothness assumptions
Convergence guarantees in Wasserstein distances
Non-asymptotic bounds for excess risk optimization
🔎 Similar Papers
No similar papers found.
Tim Johnston
Université Paris Dauphine-PSL, Ceremade, France
Iosif Lytras
Postdoctoral Researcher Archimedes Research Centre for Data and Algorithms
Numerics of SDEs · Langevin-based Sampling · Diffusion models · Optimization
Nikolaos Makras
School of Mathematics, University of Edinburgh, UK
S. Sabanis
School of Mathematics, University of Edinburgh, UK; National Technical University of Athens, Greece; Archimedes/Athena Research Centre, Greece