Universal Inverse Distillation for Matching Models with Real-Data Supervision (No GANs)

📅 2025-09-26
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Generative models—including diffusion models, flow matching, and related frameworks—suffer from slow inference. Existing knowledge distillation methods are either framework-specific or rely on data-free paradigms; incorporating real data typically necessitates complex adversarial training. Method: The authors propose RealUID, the first general-purpose one-step distillation framework unifying diverse matching-based generative models (e.g., diffusion, flow matching, bridge matching, and stochastic interpolants). Grounded in inverse distillation theory, it introduces a loss that directly integrates real-data supervision—without GANs or discriminators. Contribution/Results: The method achieves high-fidelity single-step generation across multiple tasks, significantly accelerating inference while preserving cross-model generalizability and stability. It overcomes two limitations of conventional distillation: dependence on framework-specific formulations and restrictive data strategies.

📝 Abstract
While achieving exceptional generative quality, modern diffusion, flow, and other matching models suffer from slow inference, as they require many steps of iterative generation. Recent distillation methods address this by training efficient one-step generators under the guidance of a pre-trained teacher model. However, these methods are often constrained to a single framework, e.g., only diffusion or only flow models. Furthermore, they are naturally data-free, and benefiting from real data requires additional, complex adversarial training with an extra discriminator model. In this paper, we present RealUID, a universal distillation framework for all matching models that seamlessly incorporates real data into the distillation procedure without GANs. Our RealUID approach offers a simple theoretical foundation that covers previous distillation methods for Flow Matching and Diffusion models, and also extends to their modifications, such as Bridge Matching and Stochastic Interpolants.
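To make the setup concrete, here is a minimal, hypothetical NumPy sketch of the general idea the abstract describes: a multi-step "teacher" sampler is distilled into a one-step generator, with real data entering the objective directly rather than through a GAN discriminator. The toy teacher, the linear student `G(z) = a*z + b`, and the moment-based real-data term are all illustrative stand-ins chosen here; they are not the paper's actual RealUID objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: real data is 1-D Gaussian N(mu, sigma^2), used as direct supervision.
mu, sigma = 3.0, 1.0
real = rng.normal(mu, sigma, 4096)

def teacher_sample(z, n_steps=50):
    """Toy multi-step teacher: Euler integration of a straight-line flow
    transporting noise z ~ N(0,1) to a coupled endpoint mu + sigma*z.
    (A stand-in for a pre-trained matching model, not the paper's teacher.)"""
    x0 = z
    target = mu + sigma * z
    x = x0.copy()
    for _ in range(n_steps):
        x = x + (target - x0) / n_steps   # velocity of the linear interpolant
    return x

# One-step student generator G(z) = a*z + b with trainable scalars a, b.
a, b = 1.0, 0.0
lr = 0.2
for step in range(300):
    z = rng.normal(0.0, 1.0, 512)
    x_fake = a * z + b
    x_teach = teacher_sample(z)           # teacher's many-step output, in one call
    x_real = rng.choice(real, 512)

    # Gradient of 0.5*mean((x_fake - x_teach)^2)   (teacher alignment)
    # plus 0.5*(mean(x_fake) - mean(x_real))^2     (GAN-free real-data term).
    diff = x_fake - x_teach
    moment_gap = x_fake.mean() - x_real.mean()
    a -= lr * (np.mean(diff * z) + moment_gap * z.mean())
    b -= lr * (np.mean(diff) + moment_gap)

print(round(a, 2), round(b, 2))           # recovers approximately sigma and mu
```

After training, a single evaluation of `G(z)` replaces the 50 Euler steps of the teacher, which is the inference speedup one-step distillation is after; the real-data term here is a simple moment match, whereas the paper derives a principled GAN-free formulation.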
Problem

Research questions and friction points this paper is trying to address.

Slow, many-step iterative inference in diffusion and flow matching models
Existing distillation methods are framework-specific (e.g., diffusion-only or flow-only)
Incorporating real data into distillation previously required adversarial training with a discriminator
Innovation

Methods, ideas, or system contributions that make the work stand out.

Universal one-step distillation framework (RealUID) for all matching models
Incorporates real-data supervision without GANs or discriminators
Covers Flow Matching and Diffusion models, extending to Bridge Matching and Stochastic Interpolants