Adaptive Cyclic Diffusion for Inference Scaling

📅 2025-05-20
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Diffusion models lack the capability to dynamically allocate computational resources per sample based on input difficulty, relying instead on fixed denoising steps. This work introduces an adaptive inference-time scaling paradigm, proposing the Adaptive Bi-directional Cyclic Diffusion (ABCD) framework. ABCD features variable-depth bidirectional denoising cycles, a Monte Carlo Tree Search–based cycle search mechanism, an exploration-exploitation balancing strategy, and an online difficulty-aware dynamic termination controller, enabling real-time, sample-specific computational scheduling. Key contributions include: (i) the first formal definition of adaptive computation scheduling at inference time for diffusion models; (ii) the introduction of cyclic search and adaptive "thinking time" mechanisms. Evaluated across multi-task benchmarks, ABCD achieves an average 12.7% performance gain under matched FLOPs and reduces inference steps by up to 38%.

📝 Abstract
Diffusion models have demonstrated strong generative capabilities across domains ranging from image synthesis to complex reasoning tasks. However, most inference-time scaling methods rely on fixed denoising schedules, limiting their ability to adaptively allocate computation based on instance difficulty or task-specific demands. We introduce the challenge of adaptive inference-time scaling (dynamically adjusting computational effort during inference) and propose Adaptive Bi-directional Cyclic Diffusion (ABCD), a flexible, search-based inference framework. ABCD refines outputs through bi-directional diffusion cycles while adaptively controlling exploration depth and termination. It comprises three components: Cyclic Diffusion Search, Automatic Exploration-Exploitation Balancing, and Adaptive Thinking Time. Experiments show that ABCD improves performance across diverse tasks while maintaining computational efficiency.
Problem

Research questions and friction points this paper is trying to address.

Adaptive inference-time scaling for diffusion models
Dynamic computational effort adjustment during inference
Balancing exploration and exploitation in diffusion cycles
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bi-directional diffusion cycles refine outputs adaptively
Automatic exploration-exploitation balancing optimizes computation
Adaptive thinking time controls termination dynamically
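The three ideas above can be illustrated with a toy sketch (not the authors' actual ABCD implementation): a candidate is repeatedly re-noised and denoised in a cycle, the best result is kept, and the loop terminates early once improvement stalls, mimicking the adaptive "thinking time" controller. The `denoise`, `score`, and stopping thresholds here are illustrative stand-ins, not the paper's components.

```python
import random

def noisy_step(x, scale):
    # Forward (re-noising) half of the cycle: perturb the current sample.
    return x + random.gauss(0.0, scale)

def denoise(x, target):
    # Stand-in "denoiser": pulls the sample halfway toward a target.
    return x + 0.5 * (target - x)

def score(x, target):
    # Higher is better: negative distance to the target.
    return -abs(x - target)

def adaptive_cyclic_refine(x0, target, max_cycles=20, patience=3, tol=1e-3):
    """Toy adaptive cycling: re-noise, denoise, keep the best candidate,
    and stop early once `patience` cycles pass without improvement."""
    best, best_score, stall = x0, score(x0, target), 0
    cycles_used = 0
    for _ in range(max_cycles):
        cycles_used += 1
        candidate = denoise(noisy_step(best, scale=0.5), target)
        s = score(candidate, target)
        if s > best_score + tol:
            best, best_score, stall = candidate, s, 0
        else:
            stall += 1
        if stall >= patience:  # difficulty-aware early termination
            break
    return best, cycles_used

random.seed(0)
x, cycles = adaptive_cyclic_refine(5.0, 0.0)
```

Easy inputs (already close to the target) trigger the stall counter quickly and spend few cycles, while harder inputs keep improving and so keep cycling, which is the per-sample compute allocation the paper argues fixed denoising schedules cannot provide.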
Gyubin Lee (KAIST)
Truong Nhat Nguyen Bao (KAIST)
Jaesik Yoon (SAP, KAIST)
Dongwoo Lee (KAIST)
Minsu Kim (Mila – Quebec AI Institute, KAIST)
Y. Bengio (Mila – Quebec AI Institute, Université de Montréal)
Sungjin Ahn (Associate Professor, KAIST)