One-Step Diffusion Samplers via Self-Distillation and Deterministic Flow

📅 2025-12-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
Efficient sampling from unnormalized target distributions remains challenging, particularly when requiring minimal network evaluations. Method: This paper proposes the One-Step Diffusion Sampler, which generates high-fidelity samples in a single network evaluation by modeling step-size–conditioned ODEs, employing self-distilled consistency learning, and applying deterministic flow-based importance weighting. To address ELBO estimation degradation under ultra-low-step sampling, we introduce volumetric consistency regularization—a novel regularizer that jointly enforces sample quality and robust evidence lower bound estimation. Contribution/Results: Our method achieves state-of-the-art sample quality on synthetic benchmarks and Bayesian inference tasks, while reducing network evaluations by one to two orders of magnitude compared to existing diffusion samplers. It significantly improves both sampling efficiency and stability, marking the first approach to simultaneously guarantee high sample fidelity and reliable ELBO estimation in one-step diffusion sampling.

📝 Abstract
Sampling from unnormalized target distributions is a fundamental yet challenging task in machine learning and statistics. Existing sampling algorithms typically require many iterative steps to produce high-quality samples, leading to high computational costs. We introduce one-step diffusion samplers, which learn a step-conditioned ODE so that one large step reproduces the trajectory of many small ones via a state-space consistency loss. We further show that standard ELBO estimates in diffusion samplers degrade in the few-step regime because common discrete integrators yield mismatched forward/backward transition kernels. Motivated by this analysis, we derive a deterministic-flow (DF) importance weight for ELBO estimation that requires no backward kernel. To calibrate DF, we introduce a volume-consistency regularization that aligns the accumulated volume change along the flow across step resolutions. Our proposed sampler therefore achieves both high-quality sampling and stable evidence estimation in only one or a few steps. Across challenging synthetic and Bayesian benchmarks, it achieves competitive sample quality with orders-of-magnitude fewer network evaluations while maintaining robust ELBO estimates.
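The state-space consistency idea in the abstract (one large step should land where a chain of smaller steps lands) can be sketched as a self-distillation loss. The snippet below is a minimal numpy illustration, not the paper's implementation: the step-conditioned velocity field, its parameters `theta`, and the Euler discretization are all stand-in assumptions.

```python
import numpy as np

def velocity(x, t, dt, theta):
    # Toy step-size-conditioned velocity field: linear in x with
    # coefficients depending on (t, dt). Stands in for a neural network.
    a, b = theta
    return (a * t + b * dt) * x

def euler_step(x, t, dt, theta):
    # One explicit Euler step of the step-conditioned ODE.
    return x + dt * velocity(x, t, dt, theta)

def consistency_loss(x, t, dt, theta):
    # Student: a single large step of size dt.
    big = euler_step(x, t, dt, theta)
    # Teacher: two chained half steps (in training this branch would
    # typically use stopped gradients / an EMA copy of the parameters).
    half = euler_step(x, t, dt / 2, theta)
    small = euler_step(half, t + dt / 2, dt / 2, theta)
    # Penalize the mismatch between the two endpoints in state space.
    return np.mean((big - small) ** 2)
```

Minimizing this mismatch over random states and step sizes is what lets a single large step mimic a many-step trajectory.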
Problem

Research questions and friction points this paper is trying to address.

Develop one-step diffusion samplers via self-distillation and deterministic flow
Address computational inefficiency of iterative sampling from unnormalized distributions
Enable stable evidence estimation with few-step sampling and volume consistency
Innovation

Methods, ideas, or system contributions that make the work stand out.

One-step diffusion samplers via self-distillation
Deterministic-flow importance weight for ELBO estimation
Volume-consistency regularization for stable evidence estimation
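The deterministic-flow weight and the volume-consistency regularizer both hinge on the accumulated volume change of the flow, i.e. the integral of the velocity field's divergence along the path (the instantaneous change-of-variables term). A minimal numpy sketch of aligning that accumulated quantity across step resolutions is below; the cubic velocity field, its analytic divergence, and the step counts are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def velocity(x, theta):
    # Toy cubic velocity field; stands in for a learned network.
    return -theta * x ** 3

def divergence(x, theta):
    # Analytic divergence of the toy field: sum_i d v_i / d x_i.
    return -3.0 * theta * np.sum(x ** 2)

def flow_logvol(x0, theta, n_steps, dt=0.1):
    # Integrate the flow with Euler steps while accumulating the
    # volume-change term  int div(v) dt  along the trajectory.
    x, logvol = x0.copy(), 0.0
    h = dt / n_steps
    for _ in range(n_steps):
        logvol += h * divergence(x, theta)
        x = x + h * velocity(x, theta)
    return x, logvol

def volume_consistency_penalty(x0, theta, dt=0.1):
    # Regularizer: the coarse (few-step) volume change should match
    # the fine (many-step) one over the same time interval.
    _, lv_coarse = flow_logvol(x0, theta, n_steps=1, dt=dt)
    _, lv_fine = flow_logvol(x0, theta, n_steps=8, dt=dt)
    return (lv_coarse - lv_fine) ** 2
```

With the accumulated log-volume in hand, a deterministic-flow importance weight needs only the unnormalized target density at the endpoint and the base density at the start, with no backward transition kernel.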