Deep conditional distribution learning via conditional FΓΆllmer flow

πŸ“… 2024-02-02
πŸ“ˆ Citations: 1
✨ Influential: 0
πŸ€– AI Summary
This work addresses conditional distribution modeling by proposing the Conditional FΓΆllmer Flow (CFF), an ODE-based method that transports samples from a standard Gaussian distribution through a learnable conditional velocity field to approximate the target conditional distribution directly. Theoretically, the paper establishes the first end-to-end convergence error bound for conditional ODE flows under the Wasserstein-2 distance. Methodologically, CFF follows a nonparametric paradigm: a deep neural network parameterizes the conditional velocity field directly, without explicit density estimation or decoupled sampling procedures. Empirically, CFF achieves state-of-the-art performance on both nonparametric conditional density estimation and conditional image generation, demonstrating high accuracy, rigorous theoretical guarantees, and strong generalization across diverse domains.

πŸ“ Abstract
We introduce an ordinary differential equation (ODE) based deep generative method for learning conditional distributions, named the Conditional FΓΆllmer Flow. Starting from a standard Gaussian distribution, the proposed flow approximates the target conditional distribution well as time approaches 1. For effective implementation, we discretize the flow with Euler's method, estimating the velocity field nonparametrically using a deep neural network. Furthermore, we establish a convergence result for the Wasserstein-2 distance between the distribution of the learned samples and the target conditional distribution, providing the first comprehensive end-to-end error analysis for conditional distribution learning via ODE flows. Our numerical experiments showcase its effectiveness across a range of scenarios, from standard nonparametric conditional density estimation problems to more intricate challenges involving image data, illustrating its superiority over various existing conditional density estimation methods.
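The Euler discretization of an ODE flow mentioned in the abstract can be sketched as below. This is a minimal illustration, not the paper's implementation: the `velocity` callable stands in for the learned conditional velocity field, and the step count and the stopping time slightly before 1 (mirroring "when the time is close to 1") are illustrative assumptions.

```python
import numpy as np

def euler_sample(velocity, x0, n_steps=200, t_end=0.99):
    """Integrate dx/dt = velocity(x, t) from t=0 to t_end with Euler steps.

    `velocity` is any callable (x, t) -> dx with the same shape as x; in the
    paper's setting it would be a neural network also conditioned on covariates.
    Stopping at t_end < 1 mimics running the flow until time is close to 1.
    """
    x = np.asarray(x0, dtype=float).copy()
    dt = t_end / n_steps
    t = 0.0
    for _ in range(n_steps):
        x = x + dt * velocity(x, t)  # one explicit Euler step
        t += dt
    return x

# Toy check: with velocity(x, t) = -x the ODE contracts toward the origin,
# so Gaussian initial samples shrink in norm after integration.
rng = np.random.default_rng(0)
x0 = rng.standard_normal((1000, 2))
xT = euler_sample(lambda x, t: -x, x0)
```

In practice one would replace the toy contracting field with the trained conditional velocity network and feed the conditioning variable through it at every step.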
Problem

Research questions and friction points this paper is trying to address.

Learning conditional distributions via ODE-based generative models
Establishing convergence guarantees for Wasserstein-2 distance
Outperforming existing methods in conditional density estimation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses an ODE-based deep generative method
Estimates the velocity field with a deep neural network
Establishes convergence guarantees under the Wasserstein-2 distance
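As a rough illustration of how a velocity field can be estimated nonparametrically by regression, the sketch below writes down a Monte Carlo squared-error objective against the time derivative of a Gaussian-to-data interpolation. The specific schedule `z_t = t*y + sqrt(1 - t**2)*eps` and all function names are assumptions made for illustration; the paper's actual FΓΆllmer construction and estimator may differ.

```python
import numpy as np

def interpolate(y, eps, t):
    # Assumed interpolation between Gaussian noise eps (t=0) and data y (t=1).
    return t * y + np.sqrt(1.0 - t**2) * eps

def velocity_target(y, eps, t):
    # Time derivative of the interpolation above; this is the regression
    # target for a (here hypothetical) neural velocity field.
    return y - (t / np.sqrt(1.0 - t**2)) * eps

def mc_loss(velocity, y, x, t, rng):
    # Monte Carlo estimate of E || v(z_t, x, t) - dz_t/dt ||^2, the
    # squared-error objective a velocity network would minimize.
    eps = rng.standard_normal(y.shape)
    z = interpolate(y, eps, t)
    return float(np.mean((velocity(z, x, t) - velocity_target(y, eps, t)) ** 2))

# Evaluate the objective for a trivial zero field on toy data.
rng = np.random.default_rng(1)
y = rng.standard_normal((256, 2))  # responses
x = rng.standard_normal((256, 3))  # covariates (unused by the toy field)
zero_field = lambda z, x, t: np.zeros_like(z)
loss = mc_loss(zero_field, y, x, t=0.5, rng=rng)
```

A trained network would replace `zero_field`, taking the interpolated state, the covariates, and the time as inputs, and the fitted field would then drive the Euler sampler at generation time.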
Jinyuan Chang
School of Statistics, Southwestern University of Finance and Economics
Zhao Ding
School of Mathematics and Statistics, Wuhan University
Yuling Jiao
School of Mathematics and Statistics and Hubei Key Laboratory of Computational Science, Wuhan University
Ruoxuan Li
Columbia University (computational cognitive science, computational social science)
Jerry Zhijian Yang
School of Mathematics and Statistics and Hubei Key Laboratory of Computational Science, Wuhan University