Distilling Diversity and Control in Diffusion Models

📅 2025-03-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Diffusion model distillation commonly suffers from severe sample diversity collapse, leaving distilled models substantially behind their base counterparts. This work identifies the root cause as distortion of concept-level representation structure during distillation, and finds that concept representations transfer across base and distilled models. Building on these insights, the paper proposes a dual-paradigm framework: (1) *Control Distillation*, enabling controllable generation via zero-shot transfer of Concept Sliders and LoRAs between base and distilled models; and (2) *Diversity Distillation*, restoring diversity by invoking the base model only at the first sampling step. Combined with Diffusion Target (DT) visualization analysis and a hybrid inference schedule, the method fully restores, and even surpasses, the base model's diversity without modifying the model architecture or requiring additional training, while retaining over 98% of the distilled model's inference efficiency.

📝 Abstract
Distilled diffusion models suffer from a critical limitation: reduced sample diversity compared to their base counterparts. In this work, we uncover that despite this diversity loss, distilled models retain the fundamental concept representations of base models. We demonstrate control distillation - where control mechanisms like Concept Sliders and LoRAs trained on base models can be seamlessly transferred to distilled models and vice versa, effectively distilling control without any retraining. This preservation of representational structure prompted our investigation into the mechanisms of diversity collapse during distillation. To understand how distillation affects diversity, we introduce Diffusion Target (DT) Visualization, an analysis and debugging tool that reveals how models predict final outputs at intermediate steps. Through DT-Visualization, we identify generation artifacts and inconsistencies, and demonstrate that initial diffusion timesteps disproportionately determine output diversity, while later steps primarily refine details. Based on these insights, we introduce diversity distillation - a hybrid inference approach that strategically employs the base model for only the first critical timestep before transitioning to the efficient distilled model. Our experiments demonstrate that this simple modification not only restores the diversity capabilities of base models in distilled models but surprisingly exceeds them, while maintaining nearly the computational efficiency of distilled inference, all without requiring additional training or model modifications. Our code and data are available at https://distillation.baulab.info
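The hybrid inference schedule described above can be sketched as a simple sampling loop. This is a minimal illustration, not the paper's implementation: `base_step` and `distilled_step` are hypothetical stand-ins for one denoising step of each model, and the timestep list is arbitrary.

```python
import numpy as np

def hybrid_sample(base_step, distilled_step, x_T, timesteps):
    """Diversity distillation (sketch): run the base model for the first,
    diversity-critical timestep only, then hand every remaining step to
    the fast distilled model."""
    x = base_step(x_T, timesteps[0])      # base model fixes the global layout
    for t in timesteps[1:]:
        x = distilled_step(x, t)          # distilled model refines details
    return x

# Hypothetical stand-ins for one denoising step of each model;
# real models would be large neural networks.
calls = {"base": 0, "distilled": 0}

def base_step(x, t):
    calls["base"] += 1
    return 0.9 * x

def distilled_step(x, t):
    calls["distilled"] += 1
    return 0.5 * x

rng = np.random.default_rng(0)
x_T = rng.standard_normal((4, 4))
out = hybrid_sample(base_step, distilled_step, x_T, [999, 749, 499, 249])
```

Because only one step runs through the (slow) base model, the overhead over pure distilled inference stays small, which is consistent with the reported ~98% efficiency retention.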
Problem

Research questions and friction points this paper is trying to address.

Reduced sample diversity in distilled diffusion models
Transferring control mechanisms between base and distilled models
Understanding and mitigating diversity collapse during distillation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Control distillation transfers mechanisms without retraining.
DT-Visualization analyzes diversity collapse in distillation.
Diversity distillation hybridizes base and distilled models.
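As a sketch of what DT-Visualization computes, the model's implied final output at an intermediate step can be projected out with the standard DDPM epsilon-parameterization identity. This is an assumption based on common diffusion practice; the paper's exact formulation may differ.

```python
import numpy as np

def dt_visualize(x_t, eps_pred, alpha_bar_t):
    """DT-Visualization core (sketch): from a noisy sample x_t and the
    model's noise prediction eps_pred, recover the implied final output
        x0_hat = (x_t - sqrt(1 - a_bar_t) * eps) / sqrt(a_bar_t),
    which can be rendered to inspect what the model "thinks" the final
    image is at timestep t."""
    return (x_t - np.sqrt(1.0 - alpha_bar_t) * eps_pred) / np.sqrt(alpha_bar_t)

# Round-trip check: noise a known x0 with a known eps, then recover it.
rng = np.random.default_rng(0)
x0 = rng.standard_normal((8, 8))
eps = rng.standard_normal((8, 8))
alpha_bar = 0.3
x_t = np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps
x0_hat = dt_visualize(x_t, eps, alpha_bar)
```

Visualizing `x0_hat` across timesteps is what exposes the paper's finding that early steps largely determine diversity while later steps only refine details.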