🤖 AI Summary
This work proposes an iterative framework for co-evolving datasets and diffusion models to address performance degradation caused by heterogeneous sample quality in modern datasets. By integrating a non-destructive data recycling mechanism with a noise-robust training strategy, the method synthesizes higher-quality samples at controlled noise levels in each iteration and effectively handles noisy data through Ambient Diffusion. Theoretical analysis and extensive experiments demonstrate that the proposed framework achieves state-of-the-art performance across diverse tasks, including unconditional and text-conditioned image generation as well as de novo protein design, significantly enhancing the robustness and generalization capabilities of generative models.
📝 Abstract
We propose Ambient Dataloops, an iterative framework for refining datasets that makes it easier for diffusion models to learn the underlying data distribution. Modern datasets contain samples of highly varying quality, and training directly on such heterogeneous data often yields suboptimal models. Our method is a dataset-model co-evolution process: at each iteration, the dataset becomes progressively higher quality, and the model improves accordingly. To avoid destructive self-consuming loops, at each generation we treat the synthetically improved samples as noisy, but at a slightly lower noise level than in the previous iteration, and we use Ambient Diffusion techniques for learning under corruption. Empirically, Ambient Dataloops achieve state-of-the-art performance in unconditional and text-conditional image generation and in de novo protein design. We further provide a theoretical justification for the proposed framework that captures the benefits of the data looping procedure.
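The co-evolution loop described in the abstract can be sketched schematically. The code below is a minimal illustration, not the paper's implementation: `train`, `sample`, the geometric noise-level decay, and all toy stand-ins are assumptions made for exposition.

```python
import numpy as np

def ambient_dataloop(dataset, train, sample, num_iters=3, sigma0=0.5, decay=0.5):
    """Hypothetical sketch of the dataset-model co-evolution loop.

    Each iteration: the current model synthesizes a replacement dataset,
    which is treated as *noisy* at a noise level slightly below the
    previous iteration's, and the next model is trained with a
    noise-aware (ambient-style) objective at that level.
    """
    sigma = sigma0
    model = train(dataset, sigma=sigma)   # initial noise-robust training
    history = [sigma]
    for _ in range(num_iters):
        sigma *= decay                    # assumed schedule: shrink the noise level
        dataset = sample(model, len(dataset))   # synthesize an improved dataset
        model = train(dataset, sigma=sigma)     # retrain, treating data as noisy at sigma
        history.append(sigma)
    return model, dataset, history

# Toy stand-ins for illustration only: the "model" is just a scalar mean,
# "training" estimates it, and "sampling" draws Gaussians around it.
rng = np.random.default_rng(0)
toy_train = lambda data, sigma: float(np.mean(data))
toy_sample = lambda model, n: model + 0.1 * rng.standard_normal(n)

data0 = rng.standard_normal(100)
model, data_final, sigmas = ambient_dataloop(data0, toy_train, toy_sample)
```

The key structural point the sketch captures is that the assumed noise level decreases monotonically across generations, which is what prevents the self-consuming collapse the abstract warns about.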