AI Summary
This work establishes non-asymptotic convergence guarantees for diffusion models based on probability flow ordinary differential equations (ODEs) under the 2-Wasserstein distance, assuming exact score estimates and smooth log-concave data distributions. Methodologically, it combines synchronous coupling, an explicit characterization of the continuous-time contraction rate, and joint bounds on the discretization and score-matching errors, overcoming the difficulties posed by the interplay between the non-autonomous probability flow ODE and the exponential integrator used for discretization. The analysis yields explicit iteration complexity bounds for several classes of ODE-based samplers and provides the first non-asymptotic convergence theory for a general class of probability flow ODE samplers, giving rigorous foundational guarantees for probability flow sampling.
Abstract
Score-based generative modeling with probability flow ordinary differential equations (ODEs) has achieved remarkable success in a variety of applications. While various fast ODE-based samplers have been proposed in the literature and employed in practice, the theoretical understanding of the convergence properties of the probability flow ODE is still quite limited. In this paper, we provide the first non-asymptotic convergence analysis for a general class of probability flow ODE samplers in 2-Wasserstein distance, assuming accurate score estimates and smooth log-concave data distributions. We then consider various examples and establish results on the iteration complexity of the corresponding ODE-based samplers. Our proof technique relies on spelling out explicitly the contraction rate for the continuous-time ODE and analyzing the discretization and score-matching errors using synchronous coupling; the main challenge in our analysis arises from the inherent non-autonomy of the probability flow ODE and the specific exponential integrator that we study.
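To make the objects in the abstract concrete, here is a minimal, self-contained sketch (not the paper's exact scheme or constants) of a probability flow ODE sampler discretized with an exponential integrator. It assumes an Ornstein-Uhlenbeck forward process and a hypothetical 1-D Gaussian data distribution `N(mu, sigma^2)`, for which the score of the smoothed marginal `p_t` is available in closed form, standing in for the "accurate score estimate" of the analysis.

```python
import numpy as np

# Hypothetical data distribution N(mu, sigma^2); under the OU forward
# process dX = -X dt + sqrt(2) dW, the marginal is
#   p_t = N(mu * e^{-t},  sigma^2 * e^{-2t} + 1 - e^{-2t}),
# so the exact score grad log p_t(x) is known in closed form.
mu, sigma = 2.0, 0.5
T, n_steps = 5.0, 200      # time horizon and number of ODE steps
h = T / n_steps            # step size

def score(x, t):
    """Exact score of the OU-smoothed Gaussian marginal p_t."""
    var_t = sigma**2 * np.exp(-2 * t) + 1.0 - np.exp(-2 * t)
    return -(x - mu * np.exp(-t)) / var_t

rng = np.random.default_rng(0)
# Initialize from N(0, 1), which is close to p_T for large T.
x = rng.standard_normal(100_000)

# Reverse-time probability flow ODE: dy/ds = y + score(y, T - s).
# Exponential integrator: integrate the linear part exactly and freeze
# the score at the left endpoint of each step:
#   y <- e^h * y + (e^h - 1) * score(y, t).
for k in range(n_steps):
    t = T - k * h                                   # current forward time
    x = np.exp(h) * x + (np.exp(h) - 1.0) * score(x, t)

print(x.mean(), x.std())   # should be close to (mu, sigma) = (2.0, 0.5)
```

The deviation of the final sample mean and standard deviation from `(mu, sigma)` illustrates the two error sources the analysis bounds jointly: the initialization error from starting at `N(0, 1)` rather than `p_T`, and the discretization error from freezing the score over each step of the exponential integrator.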