Information-Guided Noise Allocation for Efficient Diffusion Training

📅 2026-02-20
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Traditional diffusion models rely on handcrafted noise schedules, which often waste computational resources in low-information regions and exhibit poor generalization. This work proposes InfoNoise, the first noise scheduling mechanism grounded in information theory that adapts to data by dynamically guiding the noise sampling distribution through estimation of the conditional entropy rate in the forward process, thereby replacing heuristic design. Requiring no manual hyperparameter tuning, InfoNoise matches or surpasses the performance of carefully tuned EDM schedules on natural images and accelerates CIFAR-10 training by 1.4×. On discrete data, it achieves superior generation quality with up to three times fewer training steps.

Technology Category

Application Category

📝 Abstract
Training diffusion models typically relies on manually tuned noise schedules, which can waste computation on weakly informative noise regions and limit transfer across datasets, resolutions, and representations. We revisit noise schedule allocation through an information-theoretic lens and propose the conditional entropy rate of the forward process as a theoretically grounded, data-dependent diagnostic for identifying suboptimal noise-level allocation in existing schedules. Based on these insight, we introduce InfoNoise, a principled data-adaptive training noise schedule that replaces heuristic schedule design with an information-guided noise sampling distribution derived from entropy-reduction rates estimated from denoising losses already computed during training. Across natural-image benchmarks, InfoNoise matches or surpasses tuned EDM-style schedules, in some cases with a substantial training speedup (about $1.4\times$ on CIFAR-10). On discrete datasets, where standard image-tuned schedules exhibit significant mismatch, it reaches superior quality in up to $3\times$ fewer training steps. Overall, InfoNoise makes noise scheduling data-adaptive, reducing the need for per-dataset schedule design as diffusion models expand across domains.
Problem

Research questions and friction points this paper is trying to address.

diffusion models
noise schedule
information theory
data-adaptive training
conditional entropy
Innovation

Methods, ideas, or system contributions that make the work stand out.

diffusion models
noise schedule
information theory
conditional entropy rate
data-adaptive training
🔎 Similar Papers
No similar papers found.