🤖 AI Summary
Existing diffusion models suffer from stiff reverse-time dynamics and implicitly infinite propagation speed, leading to unbounded kinetic energy and numerical instability. To address this, we propose DistillKac, a generative framework grounded in the damped wave equation and its stochastic Kac representation, which yields a probabilistic transport process with strictly finite propagation speed and therefore globally bounded kinetic energy and pathwise stability. DistillKac implements classifier-free guidance directly in velocity space and introduces an endpoint distillation strategy that preserves sampling fidelity under very few function evaluations (<10). Theoretical analysis guarantees both path-approximation accuracy and overall dynamical stability. Experiments demonstrate that DistillKac achieves high-fidelity image generation at minimal computational cost, offering better efficiency, robustness, and physical interpretability than conventional diffusion models.
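The finite-speed transport underlying DistillKac can be illustrated with the classical 1D Kac (telegraph) process: a particle moves at constant speed and flips its velocity sign at Poisson-distributed times. A minimal simulation sketch (illustrative only; the parameters `c`, `a` and the discretization are assumptions, not the paper's actual training dynamics):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_kac(n_paths=1000, c=1.0, a=2.0, T=1.0, dt=1e-3):
    """Simulate 1D Kac (telegraph) paths: constant speed c, with the
    velocity sign flipping at Poisson rate a. Each position increment
    is bounded by c*dt, so propagation speed is strictly finite."""
    x = np.zeros(n_paths)
    v = rng.choice([-c, c], size=n_paths)   # random initial direction
    for _ in range(int(T / dt)):
        x += v * dt                          # finite-speed transport step
        flip = rng.random(n_paths) < a * dt  # flip prob ≈ a*dt per step
        v = np.where(flip, -v, v)
    return x

x = simulate_kac()
```

Unlike a Brownian increment, every path stays inside the "light cone" |x| ≤ c·T, which is the pathwise analogue of the globally bounded kinetic energy claimed above.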
📝 Abstract
We present DistillKac, a fast image generator that uses the damped wave equation and its stochastic Kac representation to move probability mass at finite speed. In contrast to diffusion models, whose reverse-time velocities can become stiff and implicitly allow unbounded propagation speed, Kac dynamics enforce finite-speed transport and yield globally bounded kinetic energy. Building on this structure, we introduce classifier-free guidance in velocity space that preserves square integrability under mild conditions. We then propose endpoint-only distillation, which trains a student to match a frozen teacher over long intervals, and prove a stability result that promotes supervision at the endpoints to closeness along the entire path. Experiments demonstrate that DistillKac delivers high-quality samples with very few function evaluations while retaining the numerical stability benefits of finite-speed probability flows.
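Classifier-free guidance in velocity space presumably follows the same affine combination used for score- or noise-based guidance, applied to the learned velocity field instead. A hedged sketch under that assumption (the function name, guidance scale `w`, and parameterization are illustrative, not the paper's API):

```python
import numpy as np

def guided_velocity(v_uncond, v_cond, w=2.0):
    """Velocity-space classifier-free guidance (sketch): extrapolate
    from the unconditional toward the conditional velocity field by
    guidance scale w. For w=1 this recovers the conditional field."""
    return v_uncond + w * (v_cond - v_uncond)

# Toy 2D velocity predictions from the two network passes.
v_u = np.array([0.1, -0.2])
v_c = np.array([0.3, 0.0])
v = guided_velocity(v_u, v_c, w=2.0)  # → array([0.5, 0.2])
```

Because the guided field is a fixed linear combination of two square-integrable fields, it remains square integrable, consistent with the mild-conditions claim in the abstract.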