Boost-and-Skip: A Simple Guidance-Free Diffusion for Minority Generation

📅 2025-02-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Generating high-quality samples for minority classes remains challenging because such samples are sparse, lying in low-density regions of the data manifold, and existing diffusion-based approaches rely heavily on computationally expensive classifier guidance. Method: this paper proposes a lightweight, unconditional diffusion generative framework. Its core innovations are: (1) variance-enhanced initialization, which strengthens the representation of minority-class features in the initial noise; and (2) a timestep-skipping mechanism, combining theory-driven noise covariance rescaling with a non-uniform sampling schedule, which improves generation efficiency and fidelity without adding network parameters or external guidance signals. Results: evaluated across multiple benchmarks, the method surpasses state-of-the-art guided diffusion models on FID, diversity, and other metrics, while reducing computational overhead by over 60%. To the authors' knowledge, it is the first approach to improve both quality and efficiency for minority-class generation without any guidance.

📝 Abstract
Minority samples are underrepresented instances located in low-density regions of a data manifold, and are valuable in many generative AI applications, such as data augmentation, creative content generation, etc. Unfortunately, existing diffusion-based minority generators often rely on computationally expensive guidance dedicated to minority generation. To address this, we present a simple yet powerful guidance-free approach called Boost-and-Skip for generating minority samples using diffusion models. The key advantage of our framework is that it requires only two minimal changes to the standard generative process: (i) variance-boosted initialization and (ii) timestep skipping. We highlight that these seemingly trivial modifications are supported by solid theoretical and empirical evidence, thereby effectively promoting the emergence of underrepresented minority features. Our comprehensive experiments demonstrate that Boost-and-Skip greatly enhances the capability of generating minority samples, even rivaling guidance-based state-of-the-art approaches while requiring significantly fewer computations.
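The two modifications described above can be sketched as a small change to a standard reverse-diffusion sampling loop. This is a minimal illustration, not the paper's implementation: the denoiser `model`, the names `boost_factor` and `t_skip`, and the DDIM-style deterministic update are all assumptions for the sake of the example.

```python
import torch

def boost_and_skip_sample(model, shape, alphas_cumprod,
                          boost_factor=1.25, t_skip=100):
    """Hypothetical sketch of Boost-and-Skip over a DDIM-style sampler.

    `model(x, t)` is assumed to be a pre-trained noise-prediction network;
    `alphas_cumprod[t]` decreases from ~1 (clean) toward 0 (pure noise).
    """
    T = len(alphas_cumprod)

    # (i) Variance-boosted initialization: draw the initial latent from a
    #     Gaussian with variance larger than the standard N(0, I) prior.
    x = boost_factor * torch.randn(shape)

    # (ii) Timestep skipping: start the reverse process at T - t_skip
    #      instead of T, omitting the earliest (noisiest) steps.
    for t in reversed(range(T - t_skip)):
        a_t = alphas_cumprod[t]
        a_prev = alphas_cumprod[t - 1] if t > 0 else torch.tensor(1.0)
        eps = model(x, t)                                   # predicted noise
        x0 = (x - (1 - a_t).sqrt() * eps) / a_t.sqrt()      # predicted clean sample
        x = a_prev.sqrt() * x0 + (1 - a_prev).sqrt() * eps  # deterministic step
    return x
```

In this sketch, a `boost_factor` above 1 widens the initial noise distribution, and a positive `t_skip` shortens the trajectory; the paper's actual schedule and rescaling rule may differ.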
Problem

Research questions and friction points this paper is trying to address.

Generating minority samples with diffusion models
Eliminating the need for computationally expensive, minority-dedicated guidance
Promoting the emergence of minority features with minimal changes to standard sampling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Guidance-free diffusion model
Variance-boosted initialization technique
Timestep skipping strategy