Domain Guidance: A Simple Transfer Approach for a Pre-trained Diffusion Model

📅 2025-04-02
📈 Citations: 1
Influential: 0
🤖 AI Summary
Pre-trained diffusion models are costly to adapt to target domains for personalized generation. To address this, we propose Domain Guidance, a lightweight transfer approach that reuses the pre-trained model to steer the sampling process toward the target domain. Domain Guidance acts as a sampling-level modulation mechanism with a formulation similar to classifier-free guidance, requiring no architectural or training-pipeline modifications. Theoretical analysis elucidates how Domain Guidance improves domain alignment and generation quality. Experiments across multiple transfer benchmarks demonstrate significant gains over standard fine-tuning: a 19.6% reduction in FID and a 23.4% improvement in FD$_\text{DINOv2}$. Notably, existing fine-tuned models can directly apply Domain Guidance for immediate performance gains, without additional training.

📝 Abstract
Recent advancements in diffusion models have revolutionized generative modeling. However, the impressive and vivid outputs they produce often come at the cost of significant model scaling and increased computational demands. Consequently, building personalized diffusion models based on off-the-shelf models has emerged as an appealing alternative. In this paper, we introduce a novel perspective on conditional generation for transferring a pre-trained model. From this viewpoint, we propose *Domain Guidance*, a straightforward transfer approach that leverages pre-trained knowledge to guide the sampling process toward the target domain. Domain Guidance shares a formulation similar to advanced classifier-free guidance, facilitating better domain alignment and higher-quality generations. We provide both empirical and theoretical analyses of the mechanisms behind Domain Guidance. Our experimental results demonstrate its substantial effectiveness across various transfer benchmarks, achieving over a 19.6% improvement in FID and a 23.4% improvement in FD$_\text{DINOv2}$ compared to standard fine-tuning. Notably, existing fine-tuned models can seamlessly integrate Domain Guidance to leverage these benefits, without additional training.
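The abstract describes Domain Guidance as sharing a formulation with classifier-free guidance, using the pre-trained model to guide sampling toward the target domain. A plausible minimal sketch of such a CFG-style combination is below; the function name, the role assignment (pre-trained prediction as the reference branch, fine-tuned prediction as the target-domain branch), and the exact combination rule are illustrative assumptions, not the paper's verbatim formulation.

```python
import numpy as np

def domain_guidance(eps_pretrained, eps_finetuned, w):
    """Classifier-free-guidance-style combination of two noise
    predictions at one sampling step (illustrative sketch).

    The pre-trained model's prediction plays a role analogous to the
    unconditional branch in CFG; the guidance weight w extrapolates
    toward the fine-tuned (target-domain) prediction. With w = 1 this
    reduces to the fine-tuned prediction alone; w > 1 pushes samples
    further toward the target domain.
    """
    return eps_pretrained + w * (eps_finetuned - eps_pretrained)

# Toy example: fake noise predictions for a single sampling step.
rng = np.random.default_rng(0)
eps_pre = rng.standard_normal((4, 4))  # stand-in for pre-trained output
eps_ft = rng.standard_normal((4, 4))   # stand-in for fine-tuned output

guided = domain_guidance(eps_pre, eps_ft, w=2.0)
```

Because the combination happens purely at sampling time, an existing fine-tuned model can adopt it by simply evaluating the frozen pre-trained model alongside it at each step, which matches the abstract's claim that no additional training is required.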
Problem

Research questions and friction points this paper is trying to address.

Reducing computational costs in diffusion models
Improving domain alignment in generative models
Enhancing transfer learning for pre-trained models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Domain Guidance for pre-trained model transfer
Leverages pre-trained knowledge for target domain
Improves FID and FD without retraining