Improving Flow Matching by Aligning Flow Divergence

📅 2026-01-31
🏛️ International Conference on Machine Learning
📈 Citations: 2
Influential: 0
🤖 AI Summary
This work addresses the limitation that conditional flow matching (CFM) does not, by itself, guarantee accurate recovery of the true probability path. The authors introduce a partial differential equation characterizing the discrepancy between the learned and ground-truth probability paths, and establish, for the first time, a theoretical upper bound linking the total variation error of the probability path to a combination of the CFM loss and an associated divergence loss. Building on this bound, they formulate a joint objective that simultaneously matches the flow field and its divergence, achieving significant improvements over standard CFM on dynamical systems modeling, DNA sequence generation, and video synthesis, while preserving computational efficiency during generation.

📝 Abstract
Conditional flow matching (CFM) stands out as an efficient, simulation-free approach for training flow-based generative models, achieving remarkable performance for data generation. However, CFM is insufficient to ensure accuracy in learning probability paths. In this paper, we introduce a new partial differential equation characterization for the error between the learned and exact probability paths, along with its solution. We show that the total variation gap between the two probability paths is bounded above by a combination of the CFM loss and an associated divergence loss. This theoretical insight leads to the design of a new objective function that simultaneously matches the flow and its divergence. Our new approach improves the performance of the flow-based generative model by a noticeable margin without sacrificing generation efficiency. We showcase the advantages of this enhanced training approach over CFM on several important benchmark tasks, including generative modeling for dynamical systems, DNA sequences, and videos. Code is available at https://github.com/Utah-Math-Data-Science/Flow_Div_Matching.
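The joint objective described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the toy linear field `v_theta`, the weight `lam`, and the finite-difference divergence estimator are illustrative assumptions. It assumes the common linear (optimal-transport) conditional path x_t = (1-t) x_0 + t x_1, whose conditional velocity u_t(x | x_1) = (x_1 - x)/(1-t) has divergence -d/(1-t); the loss then adds a penalty on the mismatch between div v_theta and this conditional divergence alongside the standard CFM term.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2  # toy data dimension

# Hypothetical "learned" field: a fixed linear map standing in for a network.
A = 0.1 * rng.normal(size=(d, d))

def v_theta(t, x):
    return A @ x + (1.0 - 2.0 * t) * np.ones(d)

def divergence_fd(f, t, x, eps=1e-4):
    """Central finite-difference estimate of div_x f(t, x)."""
    div = 0.0
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        div += (f(t, x + e)[i] - f(t, x - e)[i]) / (2.0 * eps)
    return div

def joint_loss(x0, x1, t, lam=0.1):
    # Linear (OT) conditional path: x_t = (1-t) x0 + t x1.
    xt = (1.0 - t) * x0 + t * x1
    # Conditional velocity u_t(x | x1) = (x1 - x)/(1-t); at x = x_t this equals x1 - x0.
    u = x1 - x0
    cfm = np.sum((v_theta(t, xt) - u) ** 2)   # standard CFM term
    div_v = divergence_fd(v_theta, t, xt)     # divergence of the learned field
    div_u = -d / (1.0 - t)                    # divergence of the conditional field
    return cfm + lam * (div_v - div_u) ** 2   # flow + divergence matching
```

In a real training loop `v_theta` would be a neural network, the expectation over (x_0, x_1, t) would be estimated by minibatches, and the divergence would typically be computed with automatic differentiation (e.g. a Hutchinson trace estimator) rather than finite differences.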
Problem

Research questions and friction points this paper is trying to address.

conditional flow matching
probability path accuracy
flow-based generative models
flow divergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

flow matching
divergence alignment
generative modeling
probability path
partial differential equation
Yuhao Huang
Shenzhen University
Medical Image Computing, Ultrasound, Model Robustness
Taos Transue
Department of Mathematics, University of Utah, Salt Lake City, UT, USA; Scientific Computing and Imaging (SCI) Institute, Salt Lake City, UT, USA
Shih-Hsin Wang
Ph.D., University of Utah
Geometric Deep Learning, Generative Models, Algebraic Geometry
William Feldman
Department of Mathematics, University of Utah, Salt Lake City, UT, USA
Hong Zhang
Argonne National Laboratory
scientific machine learning, sensitivity analysis, high performance computing, PETSc
Bao Wang