π€ AI Summary
This work addresses conditional distribution modeling by proposing Conditional FΓΆllmer Flow (CFF), an ODE-based method that transports samples from a standard Gaussian distribution via a learnable, conditional velocity field to directly approximate the target conditional distribution. Theoretically, we establish the first end-to-end convergence error bound for conditional ODE flows under the Wasserstein-2 distance. Methodologically, CFF introduces a novel nonparametric paradigm: a neural network directly parameterizes the conditional velocity field without requiring explicit density estimation or decoupled sampling procedures. Empirically, CFF achieves state-of-the-art performance on both nonparametric conditional density estimation and conditional image generation tasks, demonstrating superior accuracy, rigorous theoretical guarantees, and strong generalization across diverse domains.
π Abstract
We introduce an ordinary differential equation (ODE) based deep generative method for learning conditional distributions, named Conditional F""ollmer Flow. Starting from a standard Gaussian distribution, the proposed flow could approximate the target conditional distribution very well when the time is close to 1. For effective implementation, we discretize the flow with Euler's method where we estimate the velocity field nonparametrically using a deep neural network. Furthermore, we also establish the convergence result for the Wasserstein-2 distance between the distribution of the learned samples and the target conditional distribution, providing the first comprehensive end-to-end error analysis for conditional distribution learning via ODE flow. Our numerical experiments showcase its effectiveness across a range of scenarios, from standard nonparametric conditional density estimation problems to more intricate challenges involving image data, illustrating its superiority over various existing conditional density estimation methods.