No Training, No Problem: Rethinking Classifier-Free Guidance for Diffusion Models

📅 2024-07-02
🏛️ International Conference on Learning Representations
📈 Citations: 12
Influential: 1
🤖 AI Summary
Existing conditional diffusion models predominantly rely on classifier-free guidance (CFG), which requires either training an unconditional branch or modifying the training procedure, and which has no clear extension to purely unconditional models. This work proposes two training-free, general-purpose guidance methods: independent condition guidance (ICG) and time-step guidance (TSG). ICG replaces CFG's unconditional branch with a prediction under an independently sampled condition, enabling high-fidelity generation from any pre-trained conditional diffusion model without retraining while matching CFG's performance. TSG extends guidance to unconditional models by exploiting the time-step information encoded in every diffusion network, substantially improving sample quality. Both methods allow the guidance weight to be adjusted during sampling and have the same sampling cost as CFG. To our knowledge, this is the first work enabling plug-and-play, training-free guidance for arbitrary pre-trained diffusion models, including purely unconditional ones, without architectural or training modifications.

📝 Abstract
Classifier-free guidance (CFG) has become the standard method for enhancing the quality of conditional diffusion models. However, employing CFG requires either training an unconditional model alongside the main diffusion model or modifying the training procedure by periodically inserting a null condition. There is also no clear extension of CFG to unconditional models. In this paper, we revisit the core principles of CFG and introduce a new method, independent condition guidance (ICG), which provides the benefits of CFG without the need for any special training procedures. Our approach streamlines the training process of conditional diffusion models and can also be applied during inference on any pre-trained conditional model. Additionally, by leveraging the time-step information encoded in all diffusion networks, we propose an extension of CFG, called time-step guidance (TSG), which can be applied to any diffusion model, including unconditional ones. Our guidance techniques are easy to implement and have the same sampling cost as CFG. Through extensive experiments, we demonstrate that ICG matches the performance of standard CFG across various conditional diffusion models. Moreover, we show that TSG improves generation quality in a manner similar to CFG, without relying on any conditional information.
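The guidance rules described above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: `toy_eps` is a stand-in scalar "denoiser" replacing a real noise-prediction network, and the function names, the condition sampler, and the time-step perturbation `t_shift` are all illustrative assumptions. The key structure it shows is that ICG contrasts the conditional prediction with a prediction under an independently sampled condition (instead of a trained null condition), and TSG contrasts predictions at the original and a perturbed time step.

```python
# Hedged sketch of ICG- and TSG-style guidance updates. `toy_eps` is a toy
# stand-in for a diffusion noise-prediction network eps_theta(x, t, c);
# all names and the perturbation scheme here are illustrative assumptions.
import random


def toy_eps(x, t, c=0.0):
    # Toy linear "network": returns a fake noise prediction.
    return 0.5 * x + 0.1 * t + 0.2 * c


def icg_guidance(x, t, c, w, cond_sampler=lambda: random.uniform(-1.0, 1.0)):
    """ICG: replace CFG's unconditional branch with a prediction under an
    independently sampled condition c_ind (no null-condition training)."""
    c_ind = cond_sampler()
    eps_cond = toy_eps(x, t, c)
    eps_ind = toy_eps(x, t, c_ind)
    # Same extrapolation form as CFG: baseline + w * (conditional - baseline).
    return eps_ind + w * (eps_cond - eps_ind)


def tsg_guidance(x, t, w, t_shift=50.0):
    """TSG: contrast the prediction at time step t with one at a perturbed
    time step, so guidance applies even to unconditional models."""
    eps_t = toy_eps(x, t)
    eps_pert = toy_eps(x, t + t_shift)
    return eps_pert + w * (eps_t - eps_pert)
```

As in CFG, each guided step costs two network evaluations, and setting the weight `w = 1` recovers the unguided conditional (for ICG) or original time-step (for TSG) prediction.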
Problem

Research questions and friction points this paper is trying to address.

CFG requires training an unconditional model alongside the main model, or periodically inserting a null condition during training
No clear way to extend CFG-style guidance to unconditional diffusion models
Special training procedures complicate conditional diffusion model training
Innovation

Methods, ideas, or system contributions that make the work stand out.

Independent condition guidance (ICG) matches CFG on any pre-trained conditional model without special training
Time-step guidance (TSG) extends CFG-style guidance to unconditional models
Both methods have the same sampling cost as CFG and require no retraining