Implicit Diffusion: Efficient Optimization through Stochastic Sampling

πŸ“… 2024-02-08
πŸ›οΈ arXiv.org
πŸ“ˆ Citations: 10
✨ Influential: 3
πŸ€– AI Summary
We address the optimization of parameterized implicit stochastic diffusion processes, proposing a single-loop, first-order framework that jointly optimizes model parameters and performs sampling, formalizing sampling as optimization over the space of probability distributions. The approach combines bilevel optimization with automatic implicit differentiation: by leveraging implicit differentiation, it performs end-to-end gradient-based optimization of diffusion parameters without explicitly differentiating through the inner sampling iterations. We establish theoretical convergence guarantees for the proposed algorithm. Empirically, on energy-based model training and denoising diffusion fine-tuning, the method improves both optimization efficiency and generative quality while remaining computationally tractable.
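The single-loop idea can be illustrated on a toy problem. The sketch below is not the authors' algorithm: it uses a hypothetical quadratic energy whose Langevin dynamics sample N(theta, 1), interleaves one unadjusted Langevin step (inner sampling) with one first-order update of theta (outer optimization) per iteration, and targets a chosen sample mean. All names and the objective are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_energy_x(x, theta):
    # Gradient in x of the toy energy E_theta(x) = (x - theta)^2 / 2,
    # whose Langevin stationary distribution is N(theta, 1).
    return x - theta

target_mean = 2.0        # desired mean of the sampled distribution
theta = -1.0             # diffusion parameter being optimized
particles = rng.normal(size=512)
step_x, step_theta = 0.1, 0.05

for _ in range(2000):
    # Inner step: one unadjusted Langevin update of the particle cloud.
    noise = rng.normal(size=particles.shape)
    particles = (particles - step_x * grad_energy_x(particles, theta)
                 + np.sqrt(2 * step_x) * noise)
    # Outer step: first-order update of theta from the *current* (not yet
    # mixed) samples -- the single-loop trick, rather than nesting a full
    # sampling run inside each optimization step.
    loss_grad = 2.0 * (particles.mean() - target_mean)  # d/dm of (m - target)^2
    # For this energy the stationary mean is theta itself, so dm/dtheta = 1.
    theta = theta - step_theta * loss_grad
```

After the loop, theta has moved from -1.0 to approximately the target mean 2.0, with the particle cloud tracking the evolving distribution throughout, which is the behavior the single-loop scheme is designed to achieve.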

πŸ“ Abstract
We present a new algorithm to optimize distributions defined implicitly by parameterized stochastic diffusions. Doing so allows us to modify the outcome distribution of sampling processes by optimizing over their parameters. We introduce a general framework for first-order optimization of these processes, that performs jointly, in a single loop, optimization and sampling steps. This approach is inspired by recent advances in bilevel optimization and automatic implicit differentiation, leveraging the point of view of sampling as optimization over the space of probability distributions. We provide theoretical guarantees on the performance of our method, as well as experimental results demonstrating its effectiveness. We apply it to training energy-based models and finetuning denoising diffusions.
Problem

Research questions and friction points this paper is trying to address.

Optimize distributions defined by stochastic diffusions.
Modify outcome distributions via parameter optimization.
Train energy-based models and fine-tune denoising diffusions.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Optimizes distributions via stochastic diffusions
Combines optimization and sampling in one loop
Applies to energy models and denoising diffusions