🤖 AI Summary
This work addresses the significant noise introduced by approximating likelihood gradients in guided diffusion sampling, noise that undermines sampling stability and generation quality. The paper proposes the first concise and effective integration of adaptive moment estimation, inspired by Adam-style optimizers, into the guidance phase of Plug-and-Play diffusion sampling, dynamically denoising and stabilizing the noisy likelihood gradients. Requiring no computationally expensive operations, the method substantially outperforms costlier existing approaches and achieves state-of-the-art performance on both image restoration and class-conditional generation tasks. By improving gradient alignment and sample fidelity, the technique raises the overall quality and reliability of generated samples.
📝 Abstract
Guided diffusion sampling relies on approximating often intractable likelihood scores, which introduces significant noise into the sampling dynamics. We propose using adaptive moment estimation to stabilize these noisy likelihood scores during sampling. Despite its simplicity, our approach achieves state-of-the-art results on image restoration and class-conditional generation tasks, outperforming more complicated and often more computationally expensive methods. We provide empirical analysis of our method on both synthetic and real data, demonstrating that mitigating gradient noise through adaptive moments offers an effective way to improve alignment.
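To make the core idea concrete, the following is a minimal sketch of how Adam-style adaptive moment estimation could smooth a sequence of noisy likelihood gradients. This is an illustration of the general technique, not the paper's actual implementation: the function name, hyperparameters, and the synthetic noise model are all assumptions for demonstration purposes.

```python
import numpy as np

def adam_smooth_gradients(noisy_grads, beta1=0.9, beta2=0.999, eps=1e-8):
    """Stabilize a sequence of noisy gradients via Adam-style
    adaptive moment estimation (illustrative sketch, not the paper's code).

    Maintains exponential moving averages of the first moment (mean)
    and second moment (uncentered variance) of the gradient, applies
    the standard bias correction, and returns the variance-normalized
    update direction at each step.
    """
    m = np.zeros_like(noisy_grads[0])  # first-moment estimate
    v = np.zeros_like(noisy_grads[0])  # second-moment estimate
    smoothed = []
    for t, g in enumerate(noisy_grads, start=1):
        m = beta1 * m + (1.0 - beta1) * g
        v = beta2 * v + (1.0 - beta2) * g**2
        m_hat = m / (1.0 - beta1**t)   # bias-corrected first moment
        v_hat = v / (1.0 - beta2**t)   # bias-corrected second moment
        smoothed.append(m_hat / (np.sqrt(v_hat) + eps))
    return smoothed

# Toy usage: a fixed "true" guidance direction corrupted by heavy
# Gaussian noise, mimicking an approximated likelihood score.
rng = np.random.default_rng(0)
true_grad = np.array([1.0, -2.0, 0.5])
noisy = [true_grad + rng.normal(scale=2.0, size=3) for _ in range(200)]
stabilized = adam_smooth_gradients(noisy)
```

Because the moving average accumulates information across steps, the smoothed directions track the underlying gradient far more reliably than any single noisy sample, which is the intuition behind using adaptive moments to stabilize guidance.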