From Navigation to Refinement: Revealing the Two-Stage Nature of Flow-based Diffusion Models through Oracle Velocity

📅 2025-12-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Flow-based diffusion models—specifically flow matching (FM)—exhibit poorly understood training dynamics, hindering principled design and optimization. Method: We identify and rigorously characterize a two-phase “navigation–refinement” training dynamic: an early phase dominated by data-mode mixing for global structural generalization, followed by a late phase emphasizing memorization of recent samples for local detail optimization. Leveraging the closed-form solution of the marginal velocity field, we derive the exact Oracle FM objective and integrate theoretical analysis with empirical validation. Contribution/Results: This work is the first to formally establish and verify this two-stage mechanism. It explains the intrinsic efficacy of key techniques—including time-step shifting and classifier-free guidance—by linking them to phase-specific behavioral shifts. Our analysis clarifies the model’s evolution from generalization to memorization, offering a novel conceptual framework for understanding training dynamics. Moreover, it yields interpretable, actionable principles for architecture design and algorithmic improvement in flow-based generative modeling.

📝 Abstract
Flow-based diffusion models have emerged as a leading paradigm for training generative models across images and videos. However, their memorization-generalization behavior remains poorly understood. In this work, we revisit the flow matching (FM) objective and study its marginal velocity field, which admits a closed-form expression, allowing exact computation of the oracle FM target. Analyzing this oracle velocity field reveals that flow-based diffusion models inherently formulate a two-stage training target: an early stage guided by a mixture of data modes, and a later stage dominated by the nearest data sample. The two-stage objective leads to distinct learning behaviors: the early navigation stage generalizes across data modes to form global layouts, whereas the later refinement stage increasingly memorizes fine-grained details. Leveraging these insights, we explain the effectiveness of practical techniques such as timestep-shifted schedules, classifier-free guidance intervals, and latent space design choices. Our study deepens the understanding of diffusion model training dynamics and offers principles for guiding future architectural and algorithmic improvements.
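The closed-form marginal velocity described above admits a compact sketch. Assuming the common linear (rectified-flow) path x_t = (1−t)·x_0 + t·x_1 with Gaussian prior x_0 ~ N(0, I) and a finite dataset {x_i} (the paper's exact path convention is not stated in this summary, so this parameterization is an assumption), the oracle velocity is a posterior-weighted average of per-sample conditional velocities. The weights are near-uniform at small t (mode mixing, the "navigation" stage) and collapse onto the nearest data sample as t → 1 (the "refinement" stage):

```python
import numpy as np

def oracle_velocity(x, t, data):
    """Oracle FM velocity at point x, time t, for a finite dataset.

    Under x_t = (1-t)*x0 + t*x1 with x0 ~ N(0, I), the posterior over
    which data point x1 = x_i generated x_t has weights
    w_i ∝ N(x; t*x_i, (1-t)^2 I), and the marginal velocity is the
    weighted average of conditional velocities (x_i - x)/(1 - t).
    """
    diff = x - t * data                                   # (N, d)
    log_w = -np.sum(diff ** 2, axis=1) / (2.0 * (1.0 - t) ** 2)
    log_w -= log_w.max()                                  # numerical stability
    w = np.exp(log_w)
    w /= w.sum()                                          # softmax posterior
    return (w @ (data - x)) / (1.0 - t)                   # mixture of velocities
```

For a toy two-mode dataset {−1, +1}, the velocity at the midpoint with t ≈ 0 averages both modes (near zero), while at t close to 1 it points squarely at the nearest sample — the navigation-to-refinement transition in one function.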
Problem

Research questions and friction points this paper is trying to address.

Understanding the memorization–generalization behavior of flow-based diffusion models
Characterizing the two-stage training target: an early navigation phase and a later refinement phase
Explaining the effectiveness of practical techniques such as timestep-shifted schedules and classifier-free guidance
Innovation

Methods, ideas, or system contributions that make the work stand out.

A two-stage "navigation–refinement" characterization of the flow matching training target
Closed-form analysis of the oracle (marginal) velocity field
An explanation of why timestep-shifted schedules and classifier-free guidance intervals work
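The timestep-shifted schedules mentioned above have a common one-line form in practice. A minimal sketch, assuming the shift mapping used by SD3/Flux-style samplers, t′ = s·t / (1 + (s − 1)·t) — whether this matches the paper's exact parameterization is an assumption. For s > 1 the map fixes the endpoints but bends the schedule toward one end of the trajectory, reallocating sampling steps between the navigation and refinement phases:

```python
def shift_timesteps(t, shift=3.0):
    """Shifted timestep schedule: t' = s*t / (1 + (s-1)*t).

    Fixes t=0 and t=1; for shift > 1, intermediate timesteps are pushed
    toward 1, concentrating sampling effort at one end of the trajectory.
    shift=1.0 is the identity (uniform schedule).
    """
    return shift * t / (1.0 + (shift - 1.0) * t)
```

For example, with shift=3.0 the midpoint t=0.5 maps to t′=0.75, so half of a uniform step budget is spent on what was originally the last quarter of the trajectory.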
Haoming Liu
National University of Singapore
Economics
Jinnuo Liu
Center for Data Science, New York University Shanghai
Yanhao Li
Centre Borelli, ENS Paris-Saclay, Université Paris-Saclay
Computer Vision, Image Processing, Deepfake Detection
Liuyang Bai
Center for Data Science, New York University Shanghai
Yunkai Ji
Center for Data Science, New York University Shanghai
Yuanhe Guo
Center for Data Science, New York University Shanghai
Shenji Wan
Center for Data Science, New York University Shanghai
Hongyi Wen
Center for Data Science, New York University Shanghai