Unleashing High-Quality Image Generation in Diffusion Sampling Using Second-Order Levenberg-Marquardt-Langevin

📅 2025-05-30
🤖 AI Summary
Problem: Existing diffusion model (DM) sampling methods, e.g., Langevin dynamics, leverage only first-order geometric information, limiting their ability to faithfully capture complex data manifolds in high dimensions. Incorporating second-order Hessian geometry improves sample quality, but direct Hessian computation incurs quadratic complexity and is computationally intractable at scale. Method: We propose Levenberg-Marquardt-Langevin (LML), a training-free sampler that efficiently integrates stable second-order geometry into diffusion sampling. LML employs a low-rank Hessian approximation to reduce computational complexity and Levenberg-Marquardt-style damping to ensure numerical stability, and operates as a plug-and-play module for any pre-trained DM without fine-tuning. Contribution/Results: Experiments across multiple benchmarks demonstrate significant improvements in generation quality, measured by reduced FID and LPIPS scores, with negligible computational overhead. LML constitutes the first practical, scalable second-order geometric enhancement for high-dimensional diffusion sampling.
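The core update the summary describes can be sketched as a Langevin step whose score is preconditioned by a damped approximate inverse Hessian. This is an illustrative sketch, not the paper's actual algorithm: the function names (`lml_langevin_step`, `hessian_solve`) and the interface are assumptions for exposition.

```python
import numpy as np

def lml_langevin_step(x, score_fn, hessian_solve, step=0.01, damping=1.0, rng=None):
    """One LM-damped Langevin update (illustrative sketch, not the paper's code).

    Plain first-order Langevin:
        x <- x + step * score + sqrt(2*step) * noise
    Second-order, LM-damped variant sketched here:
        x <- x + step * (H_hat + damping*I)^{-1} score + sqrt(2*step) * noise
    where `hessian_solve(v, damping)` is assumed to return
    (H_hat + damping*I)^{-1} v for some approximate Hessian H_hat.
    """
    rng = np.random.default_rng() if rng is None else rng
    score = score_fn(x)                       # first-order information
    precond_score = hessian_solve(score, damping)  # damped second-order preconditioning
    noise = rng.standard_normal(x.shape)
    return x + step * precond_score + np.sqrt(2.0 * step) * noise
```

For a standard-normal target, `score_fn` is `lambda x: -x` and the exact Hessian is the identity, so the damped solve reduces to `lambda v, d: v / (1.0 + d)`; with `damping -> 0` the step uses full curvature, while large damping recovers a scaled first-order step.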

📝 Abstract
Diffusion models (DMs) have demonstrated a remarkable capability for generating images by learning the score function of the noised data distribution. Current DM sampling techniques typically rely on first-order Langevin dynamics at each noise level, with efforts concentrated on refining inter-level denoising strategies. While leveraging additional second-order Hessian geometry to enhance the sampling quality of Langevin dynamics is common practice in Markov chain Monte Carlo (MCMC), naive attempts to utilize Hessian geometry in high-dimensional DMs incur quadratic-complexity computational costs, rendering them non-scalable. In this work, we introduce a novel Levenberg-Marquardt-Langevin (LML) method that approximates the diffusion Hessian geometry in a training-free manner, drawing inspiration from the celebrated Levenberg-Marquardt optimization algorithm. Our approach introduces two key innovations: (1) a low-rank approximation of the diffusion Hessian that leverages the DMs' inherent structure and circumvents explicit quadratic-complexity computation; (2) a damping mechanism that stabilizes the approximated Hessian. The resulting approximate Hessian geometry enables diffusion sampling to take more accurate steps and improves image generation quality. We further conduct a theoretical analysis establishing the approximation error bound of the low-rank approximation and the convergence property of the damping mechanism. Extensive experiments across multiple pretrained DMs validate that LML significantly improves image generation quality with negligible computational overhead.
Problem

Research questions and friction points this paper is trying to address.

Enhancing image generation quality in diffusion models
Reducing computational cost of second-order Hessian geometry
Stabilizing Hessian approximation for accurate diffusion sampling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Second-order Levenberg-Marquardt-Langevin method
Low-rank Hessian approximation
Damping mechanism to stabilize the approximated Hessian