🤖 AI Summary
This paper investigates how diffusion models automatically exploit unknown low-dimensional structure inherent in target distributions to accelerate sampling. Focusing on DDIM and DDPM, two canonical samplers, and assuming exact score estimates, we analyze their convergence rates in total variation distance.
Method: We combine total-variation distance analysis, iteration-complexity arguments, and nonparametric distributional assumptions, without requiring smoothness or log-concavity.
Contribution/Results: We provide the first rigorous proof that DDIM-style samplers are adaptive to unknown low-dimensional manifolds. We establish tight upper and lower bounds, revealing that the sampling coefficients proposed by Ho et al. (2020) and Song et al. (2020) are (nearly) necessary for such adaptivity. Our analysis shows that both DDIM and DDPM achieve an iteration complexity of $O(k/\varepsilon)$ (up to logarithmic factors), where $k$ is an intrinsic dimension of the target distribution and $\varepsilon$ is the target accuracy, significantly improving upon prior total-variation convergence bounds for DDPM and extending applicability to broad classes of unstructured target distributions.
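For intuition, the deterministic DDIM update whose coefficients are at issue here can be sketched as follows. This is a minimal, hedged illustration of the Song et al. (2020) sampler written in terms of the score function, not the paper's analysis; the scalar inputs `alpha_t` and `alpha_prev` (cumulative noise-schedule products, i.e. alpha-bar in Ho et al., 2020) and the function name are choices made for this sketch.

```python
import math

def ddim_step(x_t, score_t, alpha_t, alpha_prev):
    """One deterministic DDIM update (eta = 0), expressed via the score.

    alpha_t, alpha_prev: cumulative products of the noise schedule
    (alpha-bar in Ho et al., 2020) at the current and previous steps.
    """
    # The score of the smoothed density relates to the predicted noise:
    # score(x_t) = -eps / sqrt(1 - alpha_t), hence the estimate below.
    eps_hat = -math.sqrt(1.0 - alpha_t) * score_t
    # Predicted clean sample x0 from the forward-process identity
    # x_t = sqrt(alpha_t) * x0 + sqrt(1 - alpha_t) * eps.
    x0_hat = (x_t - math.sqrt(1.0 - alpha_t) * eps_hat) / math.sqrt(alpha_t)
    # Re-noise x0_hat to the previous (lower) noise level, reusing eps_hat:
    # these sqrt(alpha_prev) / sqrt(1 - alpha_prev) coefficients are the
    # ones the lower bound shows are (nearly) necessary for adaptivity.
    return math.sqrt(alpha_prev) * x0_hat + math.sqrt(1.0 - alpha_prev) * eps_hat
```

With `alpha_prev = 1.0` the step returns `x0_hat` directly, which is why the final iteration of the sampler outputs the predicted clean sample.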
📝 Abstract
This paper investigates how diffusion generative models leverage (unknown) low-dimensional structure to accelerate sampling. Focusing on two mainstream samplers -- the denoising diffusion implicit model (DDIM) and the denoising diffusion probabilistic model (DDPM) -- and assuming accurate score estimates, we prove that their iteration complexities are no greater than the order of $k/\varepsilon$ (up to some log factor), where $\varepsilon$ is the precision in total variation distance and $k$ is some intrinsic dimension of the target distribution. Our results are applicable to a broad family of target distributions without requiring smoothness or log-concavity assumptions. Further, we develop a lower bound that suggests the (near) necessity of the coefficients introduced by Ho et al. (2020) and Song et al. (2020) in facilitating low-dimensional adaptation. Our findings provide the first rigorous evidence for the adaptivity of the DDIM-type samplers to unknown low-dimensional structure, and improve over the state-of-the-art DDPM theory regarding total variation convergence.