Trajectory Consistency for One-Step Generation on Euler Mean Flows

📅 2026-01-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing single-step generation methods struggle with optimization difficulties and high computational costs when modeling long-horizon trajectory consistency. This work proposes Euler Mean Flow (EMF), which is, to the authors' knowledge, the first to introduce semigroup theory into flow models, replacing complex trajectory consistency constraints with a linear surrogate objective that enables efficient and stable single- and few-step generation. EMF features a unified training framework that eliminates the need for Jacobian-vector products and simultaneously supports both u-prediction and x₁-prediction, substantially reducing memory and computational overhead. Evaluated on image synthesis, particle-based geometry generation, and functional generation tasks, EMF consistently improves sample quality and optimization stability under fixed sampling budgets, while cutting both training time and memory consumption by approximately 50%.
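The semigroup view mentioned above can be illustrated on a toy linear ODE, where the flow map is known in closed form: composing two short flow maps gives exactly the same result as one long one. This is a generic property of ODE flow maps, not code from the paper; the constants `a`, `r`, `s`, `t` below are illustrative.

```python
import math

# Toy linear ODE dx/dt = a*x: its exact flow map is Phi_{r->t}(x) = exp(a*(t-r)) * x.
# The semigroup property states that composing two shorter flow maps
# equals the single long-horizon flow map:
#     Phi_{s->t}(Phi_{r->s}(x)) == Phi_{r->t}(x)

a = -0.7                 # illustrative drift coefficient
r, s, t = 0.0, 0.4, 1.0  # illustrative intermediate and end times
x0 = 2.0

def flow_map(x, t0, t1):
    """Exact flow map of dx/dt = a*x from time t0 to time t1."""
    return math.exp(a * (t1 - t0)) * x

composed = flow_map(flow_map(x0, r, s), s, t)  # two short hops
direct = flow_map(x0, r, t)                    # one long hop
print(abs(composed - direct) < 1e-12)  # → True
```

A consistency-style objective asks a learned one-step map to satisfy this same composition identity; the paper's contribution is a linear surrogate that makes supervising such long-horizon compositions tractable.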

📝 Abstract
We propose \emph{Euler Mean Flows (EMF)}, a flow-based generative framework for one-step and few-step generation that enforces long-range trajectory consistency with minimal sampling cost. The key idea of EMF is to replace the trajectory consistency constraint, which is difficult to supervise and optimize over long time scales, with a principled linear surrogate that enables direct data supervision for long-horizon flow-map compositions. We derive this approximation from the semigroup formulation of flow-based models and show that, under mild regularity assumptions, it faithfully approximates the original consistency objective while being substantially easier to optimize. This formulation leads to a unified, JVP-free training framework that supports both $u$-prediction and $x_1$-prediction variants, avoiding explicit Jacobian computations and significantly reducing memory and computational overhead. Experiments on image synthesis, particle-based geometry generation, and functional generation demonstrate improved optimization stability and sample quality under fixed sampling budgets, together with approximately $50\%$ reductions in training time and memory consumption compared to existing one-step methods for image generation.

Problem
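The abstract's two prediction variants differ only in what the network outputs at sampling time. The paper's actual architectures and losses are not reproduced here; as a minimal sketch under assumed straight-line (linear-interpolation) targets, with `u_net` and `x1_net` as hypothetical stand-ins for trained models, one-step sampling under the two heads might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for trained networks: u_net predicts the *average*
# velocity over [0, 1]; x1_net predicts the endpoint x_1 directly. Both are
# hard-coded to the straight-path target for a single toy data point, only
# to show the two sampling interfaces.
x1_target = np.array([1.0, -2.0])

def u_net(x0):
    return x1_target - x0   # average velocity of the straight path x0 -> x1

def x1_net(x0):
    return x1_target        # direct endpoint prediction

x0 = rng.standard_normal(2)  # noise sample

# u-prediction: a single Euler step over the whole interval [0, 1]
sample_u = x0 + 1.0 * u_net(x0)

# x1-prediction: the network output *is* the sample
sample_x1 = x1_net(x0)

print(np.allclose(sample_u, sample_x1))  # → True
```

On the straight path the two heads coincide; in practice the choice trades off how errors in the learned model propagate through the one-step update, which is why EMF supports both in one framework.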

Research questions and friction points this paper is trying to address.

trajectory consistency
one-step generation
flow-based generative models
long-horizon optimization
sampling efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Euler Mean Flows
trajectory consistency
one-step generation
JVP-free training
flow-based generative models