Sampling through iterated approximation: Gradient-free and multi-fidelity Bayesian inference via transport

📅 2026-03-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenging setting of Bayesian inference where the posterior distribution is highly non-Gaussian, gradients are unavailable, and model evaluations are computationally expensive. To tackle this, the authors propose an iterative variational framework that integrates geometric annealing, multi-fidelity modeling, and gradient-free measure transport. By constructing a measure transport surrogate that reuses costly high-fidelity simulations and incorporating importance-weighted multi-set quadrature rules, the method enables efficient posterior sampling and accurate expectation estimation. Demonstrated on low-dimensional but strongly non-Gaussian inverse problems governed by partial differential equations, the approach substantially improves posterior approximation accuracy, yields high-quality independent samples, and achieves a favorable balance between computational efficiency and statistical fidelity.

📝 Abstract
We develop an iterative framework for Bayesian inference problems where the posterior distribution may involve computationally intensive models, intractable gradients, significant posterior concentration, and pronounced non-Gaussianity. Our approach integrates: (i) a generalized annealing scheme that combines geometric tempering with multi-fidelity modeling; (ii) expressive measure transport surrogates for the intermediate annealed and final target distributions, learned variationally without evaluating gradients of the target density; and (iii) an importance-weighting scheme to combine multiple quadrature rules, which recycles and reweights expensive model evaluations as successive posterior approximations are built. Our scheme produces both a quadrature rule for computing posterior expectations and a transport-based approximation of the posterior from which we can easily generate independent Monte Carlo samples. We demonstrate the efficiency and accuracy of our approach on low-dimensional but strongly non-Gaussian Bayesian inverse problems involving partial differential equations.
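Two of the ingredients above, geometric tempering and the recycling of model evaluations via importance weighting, can be illustrated with a minimal sketch. This is not the authors' code: the function names (`log_tempered_density`, `self_normalized_weights`) and the one-dimensional Gaussian toy problem are illustrative assumptions; the geometric path π_β ∝ π_0^(1−β)·L^β with inverse temperature β ∈ [0, 1] is the standard tempering construction the abstract's "geometric tempering" refers to.

```python
import numpy as np

def log_tempered_density(log_prior, log_lik, beta):
    """Unnormalized log-density on the geometric path:
    log pi_beta(x) = log pi_0(x) + beta * log L(x)."""
    return lambda x: log_prior(x) + beta * log_lik(x)

def self_normalized_weights(log_target, log_proposal, samples):
    """Self-normalized importance weights for recycling `samples`
    (drawn from the proposal) when estimating expectations
    under a new annealed target."""
    logw = np.array([log_target(x) - log_proposal(x) for x in samples])
    logw -= logw.max()              # stabilize the exponentials
    w = np.exp(logw)
    return w / w.sum()

# Toy problem (illustrative, not from the paper): standard-normal
# prior and a Gaussian likelihood centered at 2.
log_prior = lambda x: -0.5 * x**2
log_lik = lambda x: -0.5 * (x - 2.0) ** 2

rng = np.random.default_rng(0)
samples = rng.standard_normal(5000)   # prior draws, i.e. beta = 0

# Reweight the prior samples toward the half-tempered target beta = 0.5,
# instead of paying for fresh likelihood-model evaluations.
target = log_tempered_density(log_prior, log_lik, 0.5)
w = self_normalized_weights(target, log_prior, samples)
post_mean = float(np.sum(w * samples))   # estimate of E[x] under pi_0.5
```

In the paper's setting each `log_lik` call would be an expensive (possibly multi-fidelity) model evaluation, and the proposal would be a learned transport surrogate rather than the prior, which is exactly why reusing and reweighting past evaluations matters.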
Problem

Research questions and friction points this paper is trying to address.

Bayesian inference
intractable gradients
non-Gaussianity
computationally intensive models
posterior concentration
Innovation

Methods, ideas, or system contributions that make the work stand out.

measure transport
gradient-free inference
multi-fidelity modeling
Bayesian annealing
importance weighting