Few-Step Diffusion Sampling Through Instance-Aware Discretizations

📅 2026-03-18
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Existing diffusion model sampling methods typically employ a globally uniform time-step schedule, disregarding the varying generation complexity across individual samples and thereby limiting the quality of few-step sampling. This work proposes an instance-aware discretization framework that, for the first time, introduces sample-adaptive mechanisms into time-step scheduling. By leveraging input-dependent priors to dynamically adjust time-step allocation, and by extending gradient-based discretization search to conditional generation settings, the method achieves significant improvements in few-step sampling quality across diverse tasks, including synthetic data, pixel-space images, latent-space images, and video generation. Built on probability flow ODEs and standard numerical solvers, the approach incurs minimal tuning cost and negligible inference overhead.

๐Ÿ“ Abstract
Diffusion and flow matching models generate high-fidelity data by simulating paths defined by Ordinary or Stochastic Differential Equations (ODEs/SDEs), starting from a tractable prior distribution. The probability flow ODE formulation enables the use of advanced numerical solvers to accelerate sampling. Orthogonal yet vital to solver design is the discretization strategy. While early approaches employed handcrafted heuristics and recent methods adopt optimization-based techniques, most existing strategies enforce a globally shared timestep schedule across all samples. This uniform treatment fails to account for instance-specific complexity in the generative process, potentially limiting performance. Motivated by controlled experiments on synthetic data, which reveal the suboptimality of global schedules under instance-specific dynamics, we propose an instance-aware discretization framework. Our method learns to adapt timestep allocations based on input-dependent priors, extending gradient-based discretization search to the conditional generative setting. Empirical results across diverse settings, including synthetic data, pixel-space diffusion, latent-space images, and video flow matching models, demonstrate that our method consistently improves generation quality at marginal tuning cost relative to training and with negligible inference overhead.
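To make the core idea concrete, the sketch below is a toy illustration, not the paper's actual method: a per-instance timestep schedule is parameterized by unconstrained logits (which, in the paper's setting, an input-conditioned predictor would supply), normalized into strictly decreasing timesteps, and used to drive a few Euler steps of a probability-flow ODE. The velocity field here is a hypothetical linear toy whose flow contracts toward a known target `mu`; all names are illustrative assumptions.

```python
import numpy as np

def schedule_from_logits(logits, t_max=1.0):
    """Map unconstrained per-instance logits to a strictly decreasing
    timestep schedule t_max = t_0 > t_1 > ... > t_N = 0 (N = len(logits)).
    A softmax turns the logits into positive step fractions summing to 1,
    so the schedule endpoints are fixed and only the spacing adapts."""
    fracs = np.exp(logits - np.max(logits))
    fracs = fracs / fracs.sum()
    ts = t_max * (1.0 - np.concatenate([[0.0], np.cumsum(fracs)]))
    ts[-1] = 0.0  # pin the final timestep exactly to zero
    return ts

def euler_sample(x, ts, velocity):
    """Few-step Euler integration of dx/dt = velocity(x, t) along ts."""
    for t0, t1 in zip(ts[:-1], ts[1:]):
        x = x + (t1 - t0) * velocity(x, t0)
    return x

# Toy probability-flow velocity: straight-line flow toward a fixed target mu.
mu = np.array([2.0, -1.0])
velocity = lambda x, t: (x - mu) / t

rng = np.random.default_rng(0)
x1 = rng.normal(size=2)                      # sample from the prior at t = 1
logits = np.array([0.8, -0.3, 0.1, 0.5])     # hypothetical per-instance prediction
ts = schedule_from_logits(logits)            # 4-step, instance-specific schedule
x0 = euler_sample(x1, ts, velocity)
```

In the paper's framework the logits would be produced from input-dependent priors (e.g. the conditioning signal) and tuned by gradient-based discretization search; here they are hard-coded purely to show how a per-instance schedule plugs into an otherwise standard few-step ODE solver.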
Problem

Research questions and friction points this paper is trying to address.

diffusion sampling
instance-aware discretization
timestep scheduling
generative modeling
flow matching
Innovation

Methods, ideas, or system contributions that make the work stand out.

instance-aware discretization
few-step diffusion sampling
adaptive timestep scheduling
flow matching
conditional generative modeling