🤖 AI Summary
This work addresses signal recovery under generative neural network priors, moving beyond existing theoretical assumptions that require fully connected architectures with i.i.d. Gaussian weights. It establishes, for the first time, a rigorous state evolution (SE) analysis for random convolutional generative priors. Methodologically, the authors prove that random convolutional layers belong to the same universality class as Gaussian matrices, construct a mapping between such layers and the spatially coupled sensing matrices used in coding theory, and combine multilayer approximate message passing (ML-AMP) with SE analysis to derive precise asymptotic characterizations. The key contribution is a rigorous SE theory for convolutional generative priors, which are ubiquitous in practice, thereby substantially broadening the scope and theoretical foundation of generative-prior-based signal recovery and strengthening the connection between deep generative models and coding theory.
📝 Abstract
Signal recovery under generative neural network priors has emerged as a promising direction in statistical inference and computational imaging. Theoretical analysis of reconstruction algorithms under generative priors is, however, challenging. For generative priors with fully connected layers and Gaussian i.i.d. weights, such an analysis was achieved for the multi-layer approximate message passing (ML-AMP) algorithm via a rigorous state evolution. However, practical generative priors are typically convolutional, which offers computational benefits and useful inductive biases, so the Gaussian i.i.d. weight assumption is very limiting. In this paper, we overcome this limitation and establish the state evolution of ML-AMP for random convolutional layers. In particular, we prove that random convolutional layers belong to the same universality class as Gaussian matrices. Our proof technique is of independent interest, as it establishes a mapping between convolutional matrices and the spatially coupled sensing matrices used in coding theory.
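To make the central object concrete, the following sketch (not the paper's construction) writes a single-channel 1D convolutional layer with i.i.d. Gaussian filter weights as a dense matrix. The stride-1, no-padding setting is an assumption for illustration; the point is that the resulting matrix is banded, with the nonzero filter entries repeated along shifted rows, which is the structural resemblance to spatially coupled sensing matrices.

```python
import numpy as np

def conv1d_matrix(w, n_in):
    """Dense matrix W such that W @ x equals the valid 1D cross-correlation
    of input x with filter w (stride 1, no padding, single channel)."""
    k = len(w)
    n_out = n_in - k + 1
    W = np.zeros((n_out, n_in))
    for i in range(n_out):
        W[i, i:i + k] = w  # the filter slides one position per output row
    return W

rng = np.random.default_rng(0)
w = rng.standard_normal(3)   # random i.i.d. Gaussian filter weights
x = rng.standard_normal(8)
W = conv1d_matrix(w, 8)

# The matrix form agrees with a direct sliding-window computation.
direct = np.array([w @ x[i:i + 3] for i in range(6)])
assert np.allclose(W @ x, direct)

# Banded support: entry (i, j) is nonzero only when 0 <= j - i < len(w).
assert all(W[i, j] == 0 for i in range(6) for j in range(8)
           if not (0 <= j - i < 3))
```

Unlike a dense Gaussian sensing matrix, the entries here are highly dependent (each filter weight appears in many rows), which is precisely why the universality result for such layers is nontrivial.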