🤖 AI Summary
Estimating clean-data covariance for training-free guidance in diffusion models remains challenging; existing methods require additional computation or architectural modifications. Method: This paper proposes a zero-overhead covariance estimation framework that establishes an analytical relationship between noisy observations and clean-data covariance via the second-order Tweedie formula. It introduces, for the first time, a cross-noise-level covariance transfer mechanism and a single-level low-rank update strategy. Contribution/Results: The method requires no changes to the training pipeline or denoiser architecture and adds no inference-time computational cost. It significantly outperforms recent baselines across diverse linear inverse problems, including super-resolution, denoising, and compressed sensing, especially under few-step sampling (10–25 steps), where it achieves average PSNR gains of 1.2–2.8 dB. It thus combines high-fidelity covariance estimation with efficient reconstruction at no extra cost.
📝 Abstract
The covariance of clean data given a noisy observation is an important quantity in many training-free guided generation methods for diffusion models. Current methods require heavy test-time computation, alterations to the standard diffusion training process or denoiser architecture, or crude approximations. We propose a new framework that sidesteps these issues by using covariance information that is available for free from training data and from the curvature of the generative trajectory, which is linked to the covariance through the second-order Tweedie formula. We integrate these sources of information using (i) a novel method to transfer covariance estimates across noise levels and (ii) low-rank updates within a given noise level. We validate the method on linear inverse problems, where it outperforms recent baselines, especially with fewer diffusion steps.
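The second-order Tweedie identity the abstract refers to, which for variance-exploding noising $x_t = x_0 + \sigma\,\varepsilon$ reads $\mathrm{Cov}[x_0 \mid x_t] = \sigma^2\big(I + \sigma^2 \nabla^2 \log p(x_t)\big)$, can be sanity-checked in a 1-D Gaussian setting where all quantities are available in closed form. The sketch below is illustrative only (the variances `s2` and `sigma2` are arbitrary choices, not values from the paper):

```python
# 1-D Gaussian sanity check of the second-order Tweedie identity:
#   Cov[x0 | xt] = sigma^2 * (1 + sigma^2 * d^2/dxt^2 log p(xt))
# for xt = x0 + sigma * eps, with x0 ~ N(0, s^2) and eps ~ N(0, 1).

s2 = 2.0      # prior variance of the clean data x0 (illustrative)
sigma2 = 0.5  # noise variance at this diffusion level (illustrative)

# Marginal: xt ~ N(0, s^2 + sigma^2), so the Hessian of its
# log-density is the constant -1 / (s^2 + sigma^2).
hessian_log_p = -1.0 / (s2 + sigma2)

# Covariance from the second-order Tweedie formula
cov_tweedie = sigma2 * (1.0 + sigma2 * hessian_log_p)

# Exact posterior covariance from conjugate-Gaussian Bayes
cov_exact = s2 * sigma2 / (s2 + sigma2)

print(cov_tweedie, cov_exact)  # both equal 0.4 here
assert abs(cov_tweedie - cov_exact) < 1e-12
```

In the Gaussian case the identity is exact; for real data distributions the Hessian term must be estimated, which is where the paper's cross-noise-level transfer and low-rank updates come in.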