General and Efficient Steering of Unconditional Diffusion

📅 2026-02-11
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work proposes an efficient and general guidance framework for unconditional diffusion models that eliminates the need for gradient computation during inference, circumventing the high computational cost of existing methods that rely on retraining or per-step gradient-based guidance. The approach leverages offline-learned, fixed steering vectors directly injected into the diffusion process to enable fast and controllable generation. It is grounded in two key insights: noise alignment and transferable concept vectors, which ensure a single steering vector remains effective across timesteps and samples. Concept directions in activation space are learned using Recursive Feature Machines (RFMs) without backpropagation. Experiments on CIFAR-10, ImageNet, and CelebA demonstrate that the method outperforms gradient-based guidance baselines while significantly accelerating inference.

πŸ“ Abstract
Guiding unconditional diffusion models typically requires either retraining with conditional inputs or per-step gradient computations (e.g., classifier-based guidance), both of which incur substantial computational overhead. We present a general recipe for efficiently steering unconditional diffusion without gradient guidance during inference, enabling fast controllable generation. Our approach is built on two observations about diffusion model structure. Noise alignment: even in early, highly corrupted stages, coarse semantic steering is possible using a lightweight, offline-computed guidance signal, avoiding any per-step or per-sample gradients. Transferable concept vectors: a concept direction in activation space, once learned, transfers across both timesteps and samples; the same fixed steering vector, learned at a low noise level, remains effective when injected at intermediate noise levels on every generation trajectory, providing refined conditional control with efficiency. Such concept directions can be efficiently and reliably identified via the Recursive Feature Machine (RFM), a lightweight, backpropagation-free feature learning method. Experiments on CIFAR-10, ImageNet, and CelebA demonstrate improved accuracy and quality over gradient-based guidance, while achieving significant inference speedups.
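Since the abstract leans on RFM to find concept directions without backpropagation, a simplified numpy sketch of that idea may help: kernel ridge regression with a Mahalanobis Laplace kernel, where the feature matrix M is updated from the average gradient outer product (AGOP) of the fitted predictor, and the concept direction is read off as the top eigenvector of M. The hyperparameters, synthetic data, and single-direction setup are illustrative assumptions, not the paper's configuration (which learns directions in diffusion activation space).

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_kernel(X, Z, M, L=1.0):
    """K(x,z) = exp(-d/L) with Mahalanobis distance d = sqrt((x-z)^T M (x-z))."""
    XM = X @ M
    d2 = np.maximum(
        np.sum(XM * X, axis=1)[:, None]
        - 2 * XM @ Z.T
        + np.sum((Z @ M) * Z, axis=1)[None, :],
        0.0,
    )
    d = np.sqrt(d2)
    return np.exp(-d / L), d

def rfm_fit(X, y, iters=3, L=1.0, lam=1e-3):
    n, p = X.shape
    M = np.eye(p)
    for _ in range(iters):
        # kernel ridge regression with the current feature matrix M
        K, _ = laplace_kernel(X, X, M, L)
        alpha = np.linalg.solve(K + lam * np.eye(n), y)
        # AGOP: average outer product of predictor gradients over the data
        G = np.zeros((p, p))
        for i in range(n):
            Kx, dx = laplace_kernel(X[i : i + 1], X, M, L)
            dx = np.maximum(dx, 1e-8)  # guard against the self-distance 0
            # grad f(x) = -(1/L) * sum_j alpha_j K(x,z_j) * M (x - z_j) / d_j
            diffs = (X[i] - X) / dx.T
            g = -(1.0 / L) * M @ (diffs.T @ (alpha * Kx.ravel()))
            G += np.outer(g, g)
        M = G / n
        M /= np.trace(M)  # normalize scale for stability
    return M

# synthetic check: labels depend on a single hidden direction w
p = 10
w = rng.standard_normal(p)
w /= np.linalg.norm(w)
X = rng.standard_normal((200, p))
y = X @ w

M = rfm_fit(X, y)
evals, evecs = np.linalg.eigh(M)
concept = evecs[:, -1]  # top eigenvector: the recovered concept direction
print(abs(concept @ w))
```

On this synthetic single-index task, the AGOP concentrates on the hidden direction within a few iterations, which mirrors why RFM can serve as a cheap, gradient-descent-free way to extract a steering direction.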
Problem

Research questions and friction points this paper is trying to address.

unconditional diffusion
efficient steering
gradient-free guidance
controllable generation
diffusion model
Innovation

Methods, ideas, or system contributions that make the work stand out.

diffusion steering
gradient-free guidance
concept vectors
noise alignment
Recursive Feature Machine