🤖 AI Summary
This paper addresses ill-posed inverse problems such as image super-resolution and Gaussian deblurring with FMPlug, a plug-and-play framework that improves reconstruction from pretrained flow-matching (FM) priors without domain-specific fine-tuning. FMPlug introduces two key ideas: (1) a time-adaptive warm-up strategy that aligns the temporal evolution of the observation and generative flows; and (2) sharp Gaussianity regularization, which exploits the Gaussian transition property inherent to flow matching to keep the optimized latent faithful to the prior. FMPlug operates solely on off-the-shelf pretrained FM models, introducing no additional parameters and requiring no domain-specific training data. Experiments on standard inverse-problem benchmarks show that FMPlug consistently outperforms state-of-the-art methods built on the same base FM models, with significant gains in both PSNR and SSIM across diverse inverse imaging tasks.
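The plug-and-play recipe described above, optimizing the latent of a frozen FM generator under a data-fidelity loss plus a Gaussianity penalty, can be illustrated on a toy linear problem. Everything in this sketch is a hypothetical stand-in: the linear "velocity field" `W` replaces the paper's pretrained neural FM model, the operator `A` is a generic linear forward model, and a norm-concentration penalty stands in for the sharp Gaussianity regularizer. It is not the paper's implementation, only a minimal illustration of the optimization structure:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 16, 8          # latent dimension, measurement dimension

# --- Toy stand-ins (hypothetical; the real method uses a pretrained FM network) ---
W = 0.05 * rng.standard_normal((d, d))         # linear "velocity field": v(x) = W x
A = rng.standard_normal((m, d)) / np.sqrt(d)   # known forward operator (e.g. blur + downsample)

def flow_generator(steps=10):
    """Euler-integrate dx/dt = W x from t=0 to t=1; for a linear field this is a matrix G."""
    h, G = 1.0 / steps, np.eye(d)
    for _ in range(steps):
        G = (np.eye(d) + h * W) @ G
    return G

G = flow_generator()

# Simulated noisy measurement y = A * G(z_true) + noise
z_true = rng.standard_normal(d)
y = A @ (G @ z_true) + 0.01 * rng.standard_normal(m)

lam = 0.1  # weight of the Gaussianity penalty (assumed hyperparameter)

def loss_and_grad(z):
    r = A @ (G @ z) - y                  # data-fidelity residual
    s = z @ z / d - 1.0                  # deviation from unit per-dimension variance
    loss = r @ r + lam * s ** 2          # fidelity + toy "Gaussianity" surrogate
    grad = 2.0 * G.T @ (A.T @ r) + lam * 4.0 * s * z / d
    return loss, grad

# Plug-and-play reconstruction: optimize the latent, keep the generator frozen
z = rng.standard_normal(d)               # (the paper instead warm-starts from the observation)
for _ in range(2000):
    _, g = loss_and_grad(z)
    z -= 0.05 * g

x_hat = G @ z                            # reconstructed signal
```

In the actual method, `z` would be warm-started via the time-adaptive strategy rather than drawn fresh, and the gradient would come from automatic differentiation through the FM network's ODE solve rather than a closed-form linear map.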
📝 Abstract
We present FMPlug, a novel plug-in framework that enhances foundation flow-matching (FM) priors for solving ill-posed inverse problems. Unlike traditional approaches that rely on domain-specific or untrained priors, FMPlug exploits two simple but powerful insights: the similarity between observed and desired objects, and the Gaussianity of generative flows. By introducing a time-adaptive warm-up strategy and sharp Gaussianity regularization, FMPlug unlocks the true potential of domain-agnostic foundation models. On image super-resolution and Gaussian deblurring, our method outperforms state-of-the-art approaches built on foundation FM priors by significant margins.