🤖 AI Summary
This work addresses the lack of theoretical foundations and dedicated methodologies for solving inverse problems in flow-based models. We extend diffusion-based inverse solvers to the ordinary differential equation (ODE)-driven flow matching framework, introducing a novel posterior sampling paradigm for linear inverse problems—including super-resolution, denoising, compressed sensing, and deblurring. Our key contributions are: (1) the first generalization of the Tweedie formula to flow models, enabling decoupled estimation of clean signals and noise trajectories; (2) a plug-and-play sampler requiring no additional training, compatible with both latent-space flow models and Transformer-based architectures; and (3) a hybrid sampling strategy integrating likelihood gradient guidance with controlled stochastic noise injection to enhance stability and reconstruction fidelity. Extensive experiments across four standard benchmarks demonstrate consistent and significant improvements over existing state-of-the-art methods, validating both effectiveness and generalizability.
📝 Abstract
Flow matching is a recent state-of-the-art framework for generative modeling based on ordinary differential equations (ODEs). While closely related to diffusion models, it provides a more general perspective on generative modeling. Although inverse problem solving has been extensively explored using diffusion models, it has not been rigorously examined within the broader context of flow models. Therefore, here we extend diffusion inverse solvers (DIS), which perform posterior sampling by combining a denoising diffusion prior with a likelihood gradient, into the flow framework. Specifically, by deriving the flow version of Tweedie's formula, we decompose the flow ODE into two components: one for clean image estimation and the other for noise estimation. By integrating the likelihood gradient and stochastic noise into each component, respectively, we demonstrate that posterior sampling for inverse problem solving can be effectively achieved using flows. Our proposed solver, Flow-Driven Posterior Sampling (FlowDPS), can also be seamlessly integrated into a latent flow model with a transformer architecture. Across four linear inverse problems, we confirm that FlowDPS outperforms state-of-the-art alternatives, all without requiring additional training.
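The decomposition described in the abstract can be illustrated with a minimal NumPy sketch. This is my own toy illustration, not the paper's implementation: it uses the standard rectified-flow interpolation x_t = (1 - t) x0 + t eps with velocity v = eps - x0, recovers clean-signal and noise estimates from (x_t, v), applies a DPS-style likelihood-gradient step on the clean estimate, and mixes fresh stochastic noise into the noise component before re-interpolating. The names `A`, `y`, `zeta`, and `mix` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear interpolation path used by flow-matching models:
# x_t = (1 - t) * x0 + t * eps, with ideal velocity v = eps - x0.
x0 = rng.standard_normal(8)        # "clean image"
eps = rng.standard_normal(8)       # Gaussian noise endpoint
t = 0.6
x_t = (1 - t) * x0 + t * eps
# Perturb the velocity slightly to mimic an imperfect learned network.
v = (eps - x0) + 0.05 * rng.standard_normal(8)

# Flow-version Tweedie-style decomposition: recover both endpoints
# from the current state and the (approximate) velocity.
x0_hat = x_t - t * v               # clean-signal estimate
eps_hat = x_t + (1 - t) * v        # noise estimate

# DPS-style likelihood-gradient step on the clean estimate for a
# linear problem y = A x0 (A, y, step size zeta are assumptions).
A = rng.standard_normal((4, 8))
y = A @ x0
zeta = 0.1
grad = A.T @ (A @ x0_hat - y)      # gradient of 0.5 * ||y - A x0_hat||^2
x0_guided = x0_hat - zeta * grad

# Hybrid step: inject controlled fresh noise into the noise component,
# then re-interpolate at the next (smaller) time.
t_next = 0.5
mix = 0.2                          # illustrative noise-injection weight
eps_next = np.sqrt(1 - mix**2) * eps_hat + mix * rng.standard_normal(8)
x_next = (1 - t_next) * x0_guided + t_next * eps_next
```

With a perfect velocity field the two estimates recover `x0` and `eps` exactly; here the small perturbation leaves them approximately correct, which is what makes the guidance gradient non-trivial.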