🤖 AI Summary
Diffusion models suffer from model collapse when recursively trained on synthetic data; existing work primarily addresses variance shrinkage or distributional shift, overlooking the underlying behavioral degradation mechanism. This paper identifies the core cause of collapse as a fundamental shift from generalizable generation to memorized reproduction—driven by progressive entropy reduction in synthetic data. To counteract this, we propose an entropy-based data filtering strategy that dynamically discards low-entropy (highly repetitive) samples during iterative training. Experiments demonstrate that our approach significantly delays performance degradation: under multiple rounds of recursive training, it robustly preserves both visual fidelity and diversity of generated images. Unlike heuristic or regularization-based methods, our framework offers an interpretable, principled, and operationally simple paradigm for mitigating diffusion model collapse.
📝 Abstract
The widespread use of diffusion models has led to an abundance of AI-generated data, raising concerns about model collapse -- a phenomenon in which recursive training on synthetic data leads to performance degradation. Prior work primarily characterizes this collapse via variance shrinkage or distribution shift, but these perspectives overlook its practical manifestations. This paper identifies a transition from generalization to memorization during model collapse in diffusion models: as training iterates on synthetic samples, models increasingly replicate training data instead of generating novel content. This transition is directly driven by the declining entropy of the synthetic training data produced in each training cycle, which serves as a clear indicator of model degradation. Motivated by this insight, we propose an entropy-based data selection strategy to mitigate the transition from generalization to memorization and alleviate model collapse. Empirical results show that our approach significantly enhances visual quality and diversity in recursive generation, effectively preventing collapse.
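The selection step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the histogram-based entropy estimate and the `keep_ratio` parameter are assumptions introduced here, since the summary does not specify the exact entropy measure used.

```python
import numpy as np

def sample_entropy(image, bins=256):
    """Shannon entropy (bits) of an image's pixel-intensity histogram.
    A simple proxy score; the paper's actual entropy estimator may differ.
    Assumes pixel values lie in [0, 1]."""
    hist, _ = np.histogram(image, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return float(-(p * np.log2(p)).sum())

def entropy_filter(samples, keep_ratio=0.5):
    """Keep the highest-entropy fraction of synthetic samples for the next
    training round, discarding low-entropy (highly repetitive) ones.
    `keep_ratio` is a hypothetical knob for this sketch."""
    scores = np.array([sample_entropy(s) for s in samples])
    k = max(1, int(len(samples) * keep_ratio))
    keep_idx = np.argsort(scores)[::-1][:k]  # indices of top-k entropy samples
    return [samples[i] for i in keep_idx]
```

Under this sketch, a constant (fully repetitive) image scores zero entropy and is dropped first, while diverse samples survive into the next round -- the intended effect of the selection strategy.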