Constant Rate Schedule: Constant-Rate Distributional Change for Efficient Training and Sampling in Diffusion Models

📅 2024-11-19
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the inefficiency in training and limited sampling quality of diffusion models caused by non-uniform evolution rates of the forward-process probability distributions. We propose an adaptive noise scheduling method that explicitly models and enforces a constant distribution evolution rate. To our knowledge, this is the first approach enabling explicit control over the forward-process distribution dynamics—quantifying distributional discrepancies via Wasserstein distance or KL divergence and employing an adaptive functional fit to automatically accommodate diverse datasets (e.g., LSUN, ImageNet, FFHQ) and model spaces (pixel or latent). The scheduler integrates seamlessly into mainstream frameworks including DDPM, DDIM, and LCM. Experiments demonstrate over 50% reduction in sampling steps while preserving generation fidelity; significant improvements in FID and LPIPS metrics are observed, alongside strong cross-dataset and cross-architecture generalization.

📝 Abstract
We propose a noise schedule that ensures a constant rate of change in the probability distribution of diffused data throughout the diffusion process. To obtain this schedule, we measure the probability-distributional change of diffused data by simulating the forward process and use it to determine the noise schedule before training diffusion models. The functional form of the noise schedule is automatically determined and tailored to each dataset and type of diffusion model, such as pixel space or latent space. We evaluate the effectiveness of our noise schedule on unconditional and class-conditional image generation tasks using the LSUN (Bedroom, Church, Cat, Horse), ImageNet, and FFHQ datasets. Through extensive experiments, we confirmed that our noise schedule broadly improves the performance of the pixel-space and latent-space diffusion models regardless of the dataset, sampler, and number of function evaluations.
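The procedure described above (simulate the forward process, measure how fast the diffused distribution changes, then choose timesteps so that change is constant per step) can be sketched as follows. This is a minimal illustrative re-timing sketch, not the paper's implementation: it assumes zero-mean Gaussian data (with `data_std` != 1, so the variance-preserving marginals actually change), uses the closed-form 2-Wasserstein distance between zero-mean Gaussians as the change measure, and the function name `constant_rate_schedule` is hypothetical.

```python
import numpy as np

def constant_rate_schedule(alpha_bar, n_steps, data_std=0.5):
    """Re-time a fine-grained noise schedule so successive sampling steps
    correspond to equal increments of distributional change.

    Simplifying assumption: the data are zero-mean Gaussian with std
    `data_std`, so each forward marginal is a zero-mean Gaussian and the
    2-Wasserstein distance between adjacent marginals reduces to the
    absolute difference of their standard deviations.
    """
    # Marginal std of x_t = sqrt(alpha_bar)*x_0 + sqrt(1 - alpha_bar)*eps
    sigma = np.sqrt(alpha_bar * data_std**2 + (1.0 - alpha_bar))
    # W2 between zero-mean isotropic Gaussians: |sigma_i - sigma_{i+1}|
    w2 = np.abs(np.diff(sigma))
    cum = np.concatenate([[0.0], np.cumsum(w2)])
    # Select fine-grid indices at equal increments of cumulative W2
    targets = np.linspace(0.0, cum[-1], n_steps + 1)
    return np.searchsorted(cum, targets).clip(0, len(alpha_bar) - 1)

# Example: re-time a 1000-step linear-beta reference schedule to 50 steps
betas = np.linspace(1e-4, 0.02, 1000)
alpha_bar = np.cumprod(1.0 - betas)
steps = constant_rate_schedule(alpha_bar, n_steps=50)
```

The resulting `steps` are denser where the reference schedule changes the distribution quickly and sparser where it barely moves, which is the intuition behind spending sampling budget uniformly in distribution space rather than uniformly in time. The paper instead estimates the change on real data (via Wasserstein distance or KL divergence) and fits the schedule's functional form automatically per dataset.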
Problem

Research questions and friction points this paper is trying to address.

Constant rate of distributional change
Efficient training and sampling
Tailored noise schedule for datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Constant-rate noise schedule
Automated schedule determination
Improved diffusion model performance