Single-Step Consistent Diffusion Samplers

📅 2025-02-11
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Existing diffusion-based methods for sampling from unnormalized target distributions rely on multi-step iterative procedures, incurring substantial computational overhead. Method: This paper introduces the consistent diffusion sampler, a framework for single-step, high-fidelity sample generation. It eliminates intermediate sampling steps via a self-consistency loss, supports both distillation from a pretrained diffusion sampler and end-to-end training from scratch, and leverages incomplete sampling trajectories and noisy intermediate states for training. Results: On diverse unnormalized distributions, the method matches the sample quality of traditional multi-step diffusion samplers while using less than 1% of their network evaluations, a drastic improvement in sampling efficiency.

πŸ“ Abstract
Sampling from unnormalized target distributions is a fundamental yet challenging task in machine learning and statistics. Existing sampling algorithms typically require many iterative steps to produce high-quality samples, leading to high computational costs that limit their practicality in time-sensitive or resource-constrained settings. In this work, we introduce consistent diffusion samplers, a new class of samplers designed to generate high-fidelity samples in a single step. We first develop a distillation algorithm to train a consistent diffusion sampler from a pretrained diffusion model without pre-collecting large datasets of samples. Our algorithm leverages incomplete sampling trajectories and noisy intermediate states directly from the diffusion process. We further propose a method to train a consistent diffusion sampler from scratch, fully amortizing exploration by training a single model that both performs diffusion sampling and skips intermediate steps using a self-consistency loss. Through extensive experiments on a variety of unnormalized distributions, we show that our approach yields high-fidelity samples using less than 1% of the network evaluations required by traditional diffusion samplers.
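The self-consistency idea described in the abstract can be illustrated on a toy problem: a consistency function should map any two adjacent points of the same sampling trajectory to the same clean sample. The sketch below is a minimal, hypothetical illustration, not the paper's implementation; the closed-form score, the Euler teacher step, and the candidate functions `f_exact` and `f_naive` are all assumptions chosen so the loss has a known answer.

```python
import numpy as np

rng = np.random.default_rng(0)

def score(x, t):
    # Exact score of a toy N(0, 1) target convolved with noise level t:
    # grad_x log N(x; 0, 1 + t^2). Stands in for a learned score network.
    return -x / (1.0 + t ** 2)

def euler_step(x, t, t_next):
    # One deterministic probability-flow Euler step, standing in for the
    # teacher diffusion sampler's update rule.
    return x + (t_next - t) * (-t * score(x, t))

def f_exact(x, t):
    # The ideal consistency function for this toy problem: it maps any
    # point on the trajectory straight to its clean sample.
    return x / np.sqrt(1.0 + t ** 2)

def f_naive(x, t):
    # A poor candidate (identity map), for comparison.
    return x

def self_consistency_loss(f, x, t, t_next):
    # Two adjacent points of the same trajectory should map to the same
    # output; in real training the f(x, t) branch would be stop-gradient'ed.
    x_next = euler_step(x, t, t_next)
    return np.mean((f(x_next, t_next) - f(x, t)) ** 2)

t, t_next = 1.0, 0.9
x = rng.normal(0.0, np.sqrt(1.0 + t ** 2), size=10_000)  # noisy states at level t

loss_exact = self_consistency_loss(f_exact, x, t, t_next)
loss_naive = self_consistency_loss(f_naive, x, t, t_next)
print(loss_exact, loss_naive)  # the exact map is (near-)self-consistent
```

The near-zero loss for `f_exact` reflects why minimizing self-consistency over trajectory fragments can recover a direct noise-to-sample map without ever materializing full trajectories.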
Problem

Research questions and friction points this paper is trying to address.

Challenges in sampling unnormalized distributions efficiently
High computational cost of iterative sampling algorithms
Need for single-step high-fidelity sample generation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Single-step consistent diffusion samplers
Distillation from a pretrained diffusion model without pre-collected samples
Self-consistency loss skips intermediate steps
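The efficiency claim behind these innovations can be made concrete by counting network evaluations (NFE): a conventional sampler calls the network once per step, while a consistent sampler calls it once in total. The `ToySampler` below is a hypothetical stand-in with a closed-form "network", used only to make the NFE bookkeeping explicit; the 1000-step teacher and skip-connection parameterization are illustrative assumptions.

```python
import numpy as np

class ToySampler:
    """Toy sampler that counts network evaluations (NFE).

    The 'network' returns the exact score of a toy Gaussian target;
    in the real method it would be a learned model.
    """

    def __init__(self):
        self.nfe = 0

    def network(self, x, t):
        self.nfe += 1
        return -x / (1.0 + t ** 2)  # toy score for N(0, 1 + t^2)

    def multi_step(self, x, n_steps=1000, T=1.0):
        # Conventional sampler: one network call per Euler step.
        ts = np.linspace(T, 0.0, n_steps + 1)
        for t, t_next in zip(ts[:-1], ts[1:]):
            x = x + (t_next - t) * (-t * self.network(x, t))
        return x

    def single_step(self, x, T=1.0):
        # Consistency-style sampler: a single network call, combined with
        # a skip connection, jumps from noise level T straight to a sample.
        k = (1.0 + T ** 2) - np.sqrt(1.0 + T ** 2)
        return x + k * self.network(x, T)

rng = np.random.default_rng(0)
T = 1.0
noise = rng.normal(0.0, np.sqrt(1.0 + T ** 2), size=10_000)

teacher = ToySampler()
samples_multi = teacher.multi_step(noise.copy())

student = ToySampler()
samples_single = student.single_step(noise.copy())

print(teacher.nfe, student.nfe)  # 1000 vs 1 network evaluations
```

In this toy setup both samplers recover the unit-variance target, but the single-step sampler uses 0.1% of the teacher's network evaluations, mirroring the paper's sub-1% NFE claim.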