🤖 AI Summary
In underdetermined compressive imaging, conventional message-passing algorithms rely on handcrafted priors that inadequately capture the complex statistical structure of natural images, while posterior sampling with score-based generative models incurs prohibitive computational cost. To address these limitations, this paper proposes the Score-based Turbo Message Passing (STMP) framework, which integrates score-based generative priors into a turbo message-passing architecture. Key contributions include: (i) establishing the first theoretical connection between score-based MMSE denoisers and empirical Bayes estimation; (ii) designing STMP and its quantization-robust variant Q-STMP; and (iii) deriving state-evolution (SE) equations that accurately predict the algorithms' asymptotic performance and convergence behavior. Experiments on the FFHQ dataset demonstrate that STMP significantly outperforms existing plug-and-play (PnP) methods. Both STMP and Q-STMP typically converge within 10 iterations, and Q-STMP remains robust even under 1-bit quantization.
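The empirical-Bayes connection invoked in contribution (i) is commonly expressed through Tweedie's formula: for a Gaussian noise channel, the MMSE (posterior-mean) denoiser equals the noisy observation plus the noise variance times the score of the noisy marginal. The sketch below is illustrative only and not from the paper; `mmse_denoise` is a hypothetical name, and the analytic `gaussian_score` stands in for a trained score network. It checks the formula on a toy Gaussian prior, where the MMSE denoiser is known in closed form.

```python
import numpy as np

def mmse_denoise(y, score_fn, sigma):
    """MMSE denoiser via Tweedie's formula.

    For y = x + n with n ~ N(0, sigma^2 I), the posterior mean is
    E[x | y] = y + sigma^2 * grad log p_sigma(y),
    where p_sigma is the marginal density of the noisy observation.
    In practice score_fn would be a trained score network.
    """
    return y + sigma**2 * score_fn(y, sigma)

# Toy check: for a Gaussian prior x ~ N(0, 1), the noisy marginal is
# N(0, 1 + sigma^2), so its score is -y / (1 + sigma^2), and Tweedie's
# formula reduces to the familiar linear MMSE estimate y / (1 + sigma^2).
def gaussian_score(y, sigma):
    return -y / (1.0 + sigma**2)

y = np.array([2.0, -1.0, 0.5])
sigma = 1.0
x_hat = mmse_denoise(y, gaussian_score, sigma)
# matches y / (1 + sigma^2) = [1.0, -0.5, 0.25]
```

This identity is what lets a score model trained purely for generation double as the denoising module inside a message-passing loop.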
📝 Abstract
Message-passing algorithms have been adapted for compressive imaging by incorporating various off-the-shelf image denoisers. However, these denoisers rely largely on generic or hand-crafted priors and often fall short in accurately capturing the complex statistical structure of natural images. As a result, traditional plug-and-play (PnP) methods tend to yield suboptimal reconstructions, especially in highly underdetermined regimes. Recently, score-based generative models have emerged as a powerful framework for accurately characterizing sophisticated image distributions. Yet, their direct use for posterior sampling typically incurs prohibitive computational complexity. In this paper, by exploiting the close connection between score-based generative modeling and empirical Bayes denoising, we devise a message-passing framework that integrates a score-based minimum mean-squared error (MMSE) denoiser for compressive image recovery. The resulting algorithm, named score-based turbo message passing (STMP), combines the fast convergence of message passing with the expressive power of score-based generative priors. For practical systems with quantized measurements, we further propose quantized STMP (Q-STMP), which augments STMP with a component-wise MMSE dequantization module. We demonstrate that the asymptotic performance of STMP and Q-STMP can be accurately predicted by a set of state-evolution (SE) equations. Experiments on the FFHQ dataset demonstrate that STMP strikes a significantly better performance-complexity tradeoff than competing baselines, and that Q-STMP remains robust even under 1-bit quantization. Remarkably, both STMP and Q-STMP typically converge within 10 iterations.
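As a concrete illustration of component-wise MMSE dequantization in the 1-bit setting, consider a scalar with a Gaussian message N(m, tau) observed only through its sign: the posterior is a truncated Gaussian, whose mean has a simple closed form. The sketch below is a simplified stand-in, not the paper's Q-STMP module; it ignores pre-quantization noise, and the function name `mmse_dequantize_1bit` is hypothetical.

```python
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mmse_dequantize_1bit(b, m, tau):
    """Component-wise MMSE estimate of v ~ N(m, tau) observed as b = sign(v).

    Conditioning on the sign truncates the Gaussian to a half-line, and the
    truncated-Gaussian mean is m + b*sqrt(tau)*phi(a)/Phi(b*a) with
    a = m/sqrt(tau). Noise before the quantizer is ignored for simplicity.
    """
    s = math.sqrt(tau)
    a = m / s
    return m + b * s * phi(a) / Phi(b * a)

# With an uninformative prior mean (m = 0, tau = 1) and b = +1, the
# posterior is a standard half-normal, whose mean is sqrt(2/pi) ~ 0.7979.
est = mmse_dequantize_1bit(+1.0, 0.0, 1.0)
```

Applied element-wise to the quantized measurements, an estimator of this form supplies the Gaussian-message refinement that a dequantization module feeds back into the turbo iteration.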