Latent Generative Models with Tunable Complexity for Compressed Sensing and other Inverse Problems

📅 2026-03-07
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the limitations of fixed-complexity generative priors in inverse problems, which often suffer from insufficient expressiveness or overfitting to noise. The authors propose an adaptive-complexity generative prior that dynamically modulates the capacity of diffusion models, normalizing flows, and variational autoencoders via a nested dropout mechanism, tailoring model complexity to the specific demands of each inverse problem. This approach is the first to enable continuous and controllable complexity adjustment across multiple generative model families, accompanied by theoretical guarantees in linear denoising settings. Experimental results demonstrate significant improvements over fixed-complexity baselines across diverse tasks—including compressive sensing, image inpainting, denoising, and phase retrieval—with markedly reduced reconstruction errors.

Technology Category

Application Category

📝 Abstract
Generative models have emerged as powerful priors for solving inverse problems. These models typically represent a class of natural signals using a single fixed complexity or dimensionality. This can be limiting: depending on the problem, a fixed complexity may result in high representation error if too small, or overfitting to noise if too large. We develop tunable-complexity priors for diffusion models, normalizing flows, and variational autoencoders, leveraging nested dropout. Across tasks including compressed sensing, inpainting, denoising, and phase retrieval, we show empirically that tunable priors consistently achieve lower reconstruction errors than fixed-complexity baselines. In the linear denoising setting, we provide a theoretical analysis that explicitly characterizes how the optimal tuning parameter depends on noise and model structure. This work demonstrates the potential of tunable-complexity generative priors and motivates both the development of supporting theory and their application across a wide range of inverse problems.
Problem

Research questions and friction points this paper is trying to address.

generative models
inverse problems
model complexity
compressed sensing
overfitting
Innovation

Methods, ideas, or system contributions that make the work stand out.

tunable-complexity priors
nested dropout
generative models
inverse problems
compressed sensing
🔎 Similar Papers
No similar papers found.