The Poisson Midpoint Method for Langevin Dynamics: Provably Efficient Discretization for Diffusion Models

📅 2024-05-27
🏛️ Neural Information Processing Systems
📈 Citations: 5
Influential: 0
🤖 AI Summary
Langevin Monte Carlo (LMC) suffers from inefficiency in diffusion models due to the Euler–Maruyama discretization, requiring thousands of small steps to ensure sampling fidelity. Existing stochastic midpoint methods are restricted to strongly log-concave, time-invariant target distributions and thus fail to accommodate the non-log-concave densities and time-varying drifts inherent in diffusion models. To address this, we propose the Poisson Midpoint method—the first stochastic midpoint scheme applicable to non-log-concave, time-dependent Langevin dynamics. It constructs a randomized midpoint approximation via a Poisson process, enables rigorous discrete-time error analysis, and integrates seamlessly into the reverse sampling framework. Theoretically, it achieves quadratic acceleration in convergence rate, substantially alleviating the step-count bottleneck. Empirically, on DDPM, it attains the sampling quality of the original 1000-step LMC using only 50–80 neural network evaluations—outperforming ODE-based samplers under comparable computational budgets.

📝 Abstract
Langevin Dynamics is a Stochastic Differential Equation (SDE) central to sampling and generative modeling and is implemented via time discretization. Langevin Monte Carlo (LMC), based on the Euler–Maruyama discretization, is the simplest and most studied algorithm. LMC can suffer from slow convergence, requiring a large number of steps with a small step size to obtain good-quality samples. This becomes stark in the case of diffusion models, where a large number of steps gives the best samples but quality degrades rapidly as the number of steps shrinks. The Randomized Midpoint Method has recently been proposed as a better discretization of Langevin dynamics for sampling from strongly log-concave distributions. However, important applications such as diffusion models involve non-log-concave densities and contain time-varying drift. We propose its variant, the Poisson Midpoint Method, which approximates small step-size LMC with large step sizes. We prove that this can obtain a quadratic speedup over LMC under very weak assumptions. We apply our method to diffusion models for image generation and show that it maintains the quality of DDPM with 1000 neural network calls using just 50-80 neural network calls, and outperforms ODE-based methods with similar compute.
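To make the baseline concrete, here is a minimal sketch of the Euler–Maruyama discretization of Langevin dynamics that the abstract refers to as LMC. This is a generic illustration (the target, step size, and step count below are hypothetical), not code from the paper: each iteration takes a gradient step on the potential $U$ plus Gaussian noise scaled by $\sqrt{2\eta}$.

```python
import numpy as np

def lmc_sample(grad_u, x0, step_size, n_steps, seed=None):
    """Langevin Monte Carlo: Euler-Maruyama discretization of
    dX_t = -grad U(X_t) dt + sqrt(2) dB_t, targeting exp(-U(x))."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step_size * grad_u(x) + np.sqrt(2.0 * step_size) * noise
    return x

# Toy target: standard Gaussian, U(x) = ||x||^2 / 2, so grad U(x) = x.
samples = np.array([lmc_sample(lambda x: x, np.zeros(2), 0.05, 500, seed=s)
                    for s in range(2000)])
```

The slow-convergence issue the abstract describes shows up here directly: the discretization bias scales with the step size, so faithful samples require many iterations with a small `step_size` — and in a diffusion model each iteration is a full neural network call.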
Problem

Research questions and friction points this paper is trying to address.

Improves Langevin dynamics discretization for efficient diffusion model sampling
Addresses slow convergence of standard methods requiring many small steps
Enables high-quality image generation with significantly fewer neural network calls
Innovation

Methods, ideas, or system contributions that make the work stand out.

Poisson Midpoint Method replaces Euler-Maruyama discretization
Approximates small step-size LMC with large step-sizes
Achieves quadratic speedup with fewer neural network calls
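As a rough illustration of the midpoint idea these bullets describe, here is a sketch of a single randomized-midpoint step for overdamped Langevin dynamics. This shows the general randomized-midpoint construction (Shen–Lee style) that the Poisson Midpoint Method generalizes; it is not the paper's exact Poisson-process scheme, and the target and step size in the usage example are hypothetical.

```python
import numpy as np

def randomized_midpoint_step(grad_u, x, h, rng):
    """One randomized-midpoint step for dX_t = -grad U(X_t) dt + sqrt(2) dB_t.
    The drift is evaluated at a uniformly random intermediate time, which
    cancels the leading-order discretization bias in expectation and allows
    a much larger step size h than Euler-Maruyama."""
    alpha = rng.uniform()  # random intermediate time alpha * h in (0, h)
    # Euler predictor to the random midpoint, using the Brownian increment
    # over [0, alpha * h].
    w_mid = np.sqrt(2.0 * alpha * h) * rng.standard_normal(x.shape)
    x_mid = x - alpha * h * grad_u(x) + w_mid
    # Full step uses the midpoint gradient and the *same* Brownian path,
    # extended by an independent increment over [alpha * h, h].
    w_full = w_mid + np.sqrt(2.0 * (1.0 - alpha) * h) * rng.standard_normal(x.shape)
    return x - h * grad_u(x_mid) + w_full

# Toy usage: 2000 parallel chains targeting a standard Gaussian (grad U(x) = x).
rng = np.random.default_rng(0)
chains = np.zeros((2000, 2))
for _ in range(400):
    chains = randomized_midpoint_step(lambda x: x, chains, 0.1, rng)
```

Note that one step costs two gradient (i.e., neural network) evaluations, but the reduced bias permits far fewer steps overall — the trade-off behind the 50-80 calls versus 1000 reported above.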
S. Kandasamy
Department of Computer Science, Cornell University
Dheeraj M. Nagaraj
Google DeepMind