Amortized Posterior Sampling with Diffusion Prior Distillation

📅 2024-07-25
🏛️ arXiv.org
📈 Citations: 4
Influential: 0
🤖 AI Summary
To address low sampling efficiency and limited diversity in posterior sampling for inverse problems (including image restoration, manifold signal reconstruction, and climate data imputation), this paper proposes a conditional normalizing flow framework grounded in diffusion prior distillation. Methodologically, it is the first approach to distill the implicit prior encoded in a pre-trained diffusion model into a conditional flow model capable of one-step sampling, achieved via variational inference over the posterior that the diffusion model implicitly defines. The framework operates natively on both Euclidean spaces and manifolds and adapts to each measurement without retraining. The key contribution is an amortized posterior sampling paradigm that requires only a single neural function evaluation (1 NFE), drastically accelerating posterior inference while preserving high-fidelity reconstructions and strong cross-task generalization.
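The amortized, single-evaluation sampling interface described above can be illustrated with a toy sketch. This is not the paper's implementation; all names (`ConditionalFlow`-style MLP, `sample_posterior`) and the simple conditional affine flow are illustrative assumptions. The point is the interface: after (hypothetical) training, one forward pass maps latent noise and a measurement to a posterior sample, amortized over measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(x_dim, y_dim, hidden=32):
    # Small MLP mapping a measurement y to the parameters (mu, log_sigma)
    # of a conditional affine flow: eps -> mu(y) + exp(log_sigma(y)) * eps.
    # Weights are random here; in the paper's setting they would be trained
    # by variational inference against the diffusion-defined posterior.
    return {
        "W1": rng.standard_normal((y_dim, hidden)) * 0.1,
        "b1": np.zeros(hidden),
        "W2": rng.standard_normal((hidden, 2 * x_dim)) * 0.1,
        "b2": np.zeros(2 * x_dim),
    }

def sample_posterior(params, eps, y):
    """One function evaluation: latent noise + measurement -> posterior sample."""
    h = np.tanh(y @ params["W1"] + params["b1"])
    out = h @ params["W2"] + params["b2"]
    mu, log_sigma = np.split(out, 2, axis=-1)
    return mu + np.exp(log_sigma) * eps   # single NFE, amortized in y

x_dim, y_dim = 8, 4
params = init_params(x_dim, y_dim)
y = rng.standard_normal((16, y_dim))      # a batch of measurements
eps = rng.standard_normal((16, x_dim))    # one latent draw per desired sample
samples = sample_posterior(params, eps, y)
print(samples.shape)   # (16, 8)
```

Drawing fresh `eps` for the same `y` yields diverse posterior samples at the cost of one forward pass each, which is the efficiency claim the summary highlights.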

📝 Abstract
We propose a variational inference approach to sample from the posterior distribution for solving inverse problems. From a pre-trained diffusion model, our approach trains a conditional flow model to minimize the divergence between the proposal variational distribution and the posterior distribution implicitly defined through the diffusion model. Once trained, the flow model is capable of sampling from the posterior distribution with a single NFE, amortized with respect to the measurement. The proposed method paves a new path for distilling a diffusion prior for efficient posterior sampling. We show that our method is applicable to standard signals in Euclidean space, as well as signals on manifolds.
Problem

Research questions and friction points this paper is trying to address.

Efficient posterior sampling in inverse problems
Unsupervised variational inference without paired data
Generalized sampling across Euclidean and non-Euclidean domains
Innovation

Methods, ideas, or system contributions that make the work stand out.

Amortized Posterior Sampling for efficient inference
Unsupervised conditional flow model training
Single neural function evaluation (1 NFE) for diverse samples