Regularization can make diffusion models more efficient

πŸ“… 2025-02-13
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Diffusion models suffer from high computational costs due to iterative, dense sampling, hindering practical deployment. This paper proposes a sparsity-regularized diffusion framework and provides the first theoretical proof that such regularization reduces the computational complexity of the diffusion processβ€”from scaling with the ambient input dimension to scaling with the intrinsic data dimension. Our method integrates high-dimensional statistical modeling, diffusion dynamical analysis, and rigorous error-bound derivation, and empirically optimizes sampling trajectories across multiple benchmarks. Experiments demonstrate that, while maintaining or even improving generation quality (as measured by FID and LPIPS), our approach significantly reduces the number of sampling steps (by 35–52% on average) and overall computational overhead. The core contribution lies in establishing a formal theoretical link between sparsity regularization and intrinsic-dimension-driven efficiency gains, thereby achieving synergistic optimization of both sample quality and inference speed.

Technology Category

Application Category

πŸ“ Abstract
Diffusion models are one of the key architectures of generative AI. Their main drawback, however, is the computational costs. This study indicates that the concept of sparsity, well known especially in statistics, can provide a pathway to more efficient diffusion pipelines. Our mathematical guarantees prove that sparsity can reduce the input dimension's influence on the computational complexity to that of a much smaller intrinsic dimension of the data. Our empirical findings confirm that inducing sparsity can indeed lead to better samples at a lower cost.
Problem

Research questions and friction points this paper is trying to address.

Reducing computational costs in diffusion models
Applying sparsity for efficient diffusion pipelines
Lowering input dimension influence via sparsity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Regularization enhances diffusion models
Sparsity reduces computational complexity
Efficient samples via sparsity induction
πŸ”Ž Similar Papers
No similar papers found.