Training-Free Distribution Adaptation for Diffusion Models via Maximum Mean Discrepancy Guidance

📅 2026-01-13
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge of distribution shift in pre-trained diffusion models under domain adaptation scenarios, where limited target reference samples and the inability to retrain often lead to biased generation. The authors propose a novel inference-time approach that guides the reverse diffusion process using gradients derived from Maximum Mean Discrepancy (MMD). For the first time, MMD is employed as a differentiable, low-variance distributional distance metric directly within diffusion guidance, enabling precise alignment between the generated distribution and a given target reference set without any retraining. The method naturally supports prompt-aware adaptation in conditional generation and extends efficiently to latent diffusion models (LDMs). Experiments on both synthetic and real-world benchmarks demonstrate its effectiveness in achieving distribution alignment while preserving high sample quality and fidelity.

Technology Category

Application Category

📝 Abstract
Pre-trained diffusion models have emerged as powerful generative priors for both unconditional and conditional sample generation, yet their outputs often deviate from the characteristics of user-specific target data. Such mismatches are especially problematic in domain adaptation tasks, where only a few reference examples are available and retraining the diffusion model is infeasible. Existing inference-time guidance methods can adjust sampling trajectories, but they typically optimize surrogate objectives such as classifier likelihoods rather than directly aligning with the target distribution. We propose MMD Guidance, a training-free mechanism that augments the reverse diffusion process with gradients of the Maximum Mean Discrepancy (MMD) between generated samples and a reference dataset. MMD provides reliable distributional estimates from limited data, exhibits low variance in practice, and is efficiently differentiable, which makes it particularly well-suited for the guidance task. Our framework naturally extends to prompt-aware adaptation in conditional generation models via product kernels. Also, it can be applied with computational efficiency in latent diffusion models (LDMs), since guidance is applied in the latent space of the LDM. Experiments on synthetic and real-world benchmarks demonstrate that MMD Guidance can achieve distributional alignment while preserving sample fidelity.
Problem

Research questions and friction points this paper is trying to address.

distribution adaptation
diffusion models
domain mismatch
few-shot adaptation
target distribution alignment
Innovation

Methods, ideas, or system contributions that make the work stand out.

MMD Guidance
distribution adaptation
diffusion models
training-free
maximum mean discrepancy
🔎 Similar Papers
No similar papers found.