Wasserstein Convergence of Critically Damped Langevin Diffusions

📅 2025-11-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the problem of bounding the sampling error, measured in Wasserstein distance, of generative models based on critically-damped Langevin diffusions (CLDs). To this end, it analyzes a generalized dynamic that introduces a tunable noise hyperparameter acting on the data coordinate of the extended state space, enabling controlled perturbation of the data and influencing the smoothness of sample paths. Theoretically, the paper derives a novel explicit upper bound on the sampling error in the Wasserstein metric, characterizing how the hyperparameter governs the convergence rate and sample quality. Methodologically, the approach draws on statistical mechanics and Hamiltonian dynamics, coupling the data with auxiliary variables and combining score matching with diffusion-based sampling. The discretization error analysis provides practical guidance for tuning the new hyperparameter, leading to improved sampling performance.
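For context, the standard CLD of Dockhorn et al. (2022) couples the data coordinate x_t with an auxiliary velocity v_t and injects Brownian noise only through the velocity channel. The sketch below contrasts it with a plausible form of the generalized dynamic; the coefficient ν on the data coordinate is a hypothetical stand-in for the paper's additional noise hyperparameter, not necessarily its exact parameterization.

```latex
% Standard CLD: noise enters only through the velocity v_t;
% critical damping corresponds to \Gamma^2 = 4M.
\begin{aligned}
  \mathrm{d}x_t &= \beta M^{-1} v_t \,\mathrm{d}t, \\
  \mathrm{d}v_t &= -\beta x_t \,\mathrm{d}t - \beta \Gamma M^{-1} v_t \,\mathrm{d}t
                   + \sqrt{2\beta\Gamma}\,\mathrm{d}W_t .
\end{aligned}
% Hypothetical generalized dynamic: an extra coefficient \nu \ge 0 also
% perturbs the data coordinate through an independent Brownian motion B_t.
\begin{aligned}
  \mathrm{d}x_t &= \beta M^{-1} v_t \,\mathrm{d}t + \sqrt{2\beta\nu}\,\mathrm{d}B_t, \\
  \mathrm{d}v_t &= -\beta x_t \,\mathrm{d}t - \beta \Gamma M^{-1} v_t \,\mathrm{d}t
                   + \sqrt{2\beta\Gamma}\,\mathrm{d}W_t .
\end{aligned}
```

Setting ν = 0 recovers the standard CLD, whose data paths are differentiable; ν > 0 trades some of that smoothness for direct noise injection on the data coordinate.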

📝 Abstract
Score-based Generative Models (SGMs) have achieved impressive performance in data generation across a wide range of applications and benefit from strong theoretical guarantees. Recently, methods inspired by statistical mechanics, in particular, Hamiltonian dynamics, have introduced Critically-damped Langevin Diffusions (CLDs), which define diffusion processes on extended spaces by coupling the data with auxiliary variables. These approaches, along with their associated score-matching and sampling procedures, have been shown to outperform standard diffusion-based samplers numerically. In this paper, we analyze a generalized dynamic that extends classical CLDs by introducing an additional hyperparameter controlling the noise applied to the data coordinate, thereby better exploiting the extended space. We further derive a novel upper bound on the sampling error of CLD-based generative models in the Wasserstein metric. This additional hyperparameter influences the smoothness of sample paths, and our discretization error analysis provides practical guidance for its tuning, leading to improved sampling performance.
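As a concrete illustration of the discretization being analyzed, here is a minimal Euler-Maruyama simulation of such a forward dynamic. It assumes the generalized SDE sketched above; the data-noise coefficient `nu` and the helper `cld_forward` are hypothetical and illustrative, not the paper's implementation.

```python
import numpy as np

def cld_forward(x0, n_steps=1000, dt=1e-3, beta=4.0, M=0.25, nu=0.1, seed=0):
    """Euler-Maruyama discretization of a generalized critically-damped
    Langevin forward diffusion. `nu` is a hypothetical stand-in for the
    extra data-noise hyperparameter; nu = 0 recovers the standard CLD."""
    rng = np.random.default_rng(seed)
    gamma = 2.0 * np.sqrt(M)              # critical damping: gamma^2 = 4M
    x, v = x0.copy(), np.zeros_like(x0)   # auxiliary velocity starts at 0
    for _ in range(n_steps):
        dW = rng.standard_normal(v.shape) * np.sqrt(dt)  # velocity noise
        dB = rng.standard_normal(x.shape) * np.sqrt(dt)  # data-coordinate noise
        x_new = x + beta / M * v * dt + np.sqrt(2.0 * beta * nu) * dB
        v_new = (v - beta * x * dt - beta * gamma / M * v * dt
                   + np.sqrt(2.0 * beta * gamma) * dW)
        x, v = x_new, v_new
    return x, v

# Example: diffuse a batch of 1-D data points toward the Gaussian prior.
x_T, v_T = cld_forward(np.random.default_rng(1).standard_normal((512, 1)))
```

The step size `dt` is exactly the quantity a discretization error analysis constrains: smaller steps shrink the Wasserstein gap to the continuous-time process at the cost of more score evaluations during sampling.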
Problem

Research questions and friction points this paper is trying to address.

Analyzing generalized critically-damped Langevin diffusions with extended hyperparameter control
Deriving Wasserstein error bounds for CLD-based generative model sampling (the metric is recalled below)
Providing discretization guidance for hyperparameter tuning to improve performance
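For reference, the paper's error metric is the Wasserstein distance; in its common order-2 form (whether the bound is stated for W_2 or another order is not specified here), it reads:

```latex
% Wasserstein-2 distance: infimum over couplings \Pi(\mu, \nu),
% i.e., joint laws \gamma whose marginals are \mu and \nu.
W_2(\mu, \nu) = \left( \inf_{\gamma \in \Pi(\mu, \nu)}
    \int_{\mathbb{R}^d \times \mathbb{R}^d} \lVert x - y \rVert^2 \,
    \mathrm{d}\gamma(x, y) \right)^{1/2}
```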
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extended CLD dynamics with noise hyperparameter
Wasserstein error bound for CLD sampling
Discretization analysis guides hyperparameter tuning (a numerical sanity check is sketched after this list)
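The paper's bound itself is not reproduced here, but a common way to sanity-check Wasserstein convergence numerically is to exploit the fact that the CLD forward process drives samples toward a Gaussian prior, for which W_2 has a closed form. The helper `gaussian_w2` below is an illustrative utility, not part of the paper's method.

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m1, S1, m2, S2):
    """Closed-form Wasserstein-2 distance between N(m1, S1) and N(m2, S2):
    W2^2 = |m1 - m2|^2 + tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2})."""
    r2 = sqrtm(S2)
    cross = sqrtm(r2 @ S1 @ r2)
    w2_sq = np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * np.real(cross))
    return float(np.sqrt(max(w2_sq, 0.0)))

# Example: W2 between a Gaussian fit of forward-diffused samples and the
# standard normal prior that the diffusion should converge to.
samples = np.random.default_rng(0).standard_normal((10_000, 2)) * 1.1 + 0.2
m_hat, S_hat = samples.mean(axis=0), np.cov(samples, rowvar=False)
print(gaussian_w2(m_hat, S_hat, np.zeros(2), np.eye(2)))
```

Tracking this quantity across step sizes or hyperparameter values gives an empirical counterpart to the theoretical tuning guidance.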
Stanislas Strasman
Sorbonne Université and Université Paris Cité, CNRS, LPSM, F-75005 Paris, France
Sobihan Surendran
PhD Student, Sorbonne Université
Stochastic Optimization, Generative Models
Claire Boyer
Université Paris-Saclay
Sylvain Le Corff
LPSM, Sorbonne Université
Monte Carlo methods, Markov Chains, Computational Statistics, Nonparametric Statistics
Vincent Lemaire
Sorbonne Université and Université Paris Cité, CNRS, LPSM, F-75005 Paris, France
Antonio Ocello
Ecole Polytechnique