Likelihood-free Posterior Density Learning for Uncertainty Quantification in Inference Problems

📅 2025-07-31
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
For generative models or complex systems with intractable likelihoods, existing likelihood-free inference methods—such as Approximate Bayesian Computation (ABC)—rely heavily on costly simulations, suffering from low efficiency and challenging hyperparameter tuning. This paper introduces Kernel Adaptive Synthetic Posterior Estimation (KASPE), a framework that employs deep neural networks to learn an end-to-end mapping from observed data to the posterior distribution without explicit likelihood evaluation. Grounded in theoretical connections to expectation propagation, KASPE naturally accommodates heavy-tailed and multimodal posteriors. On benchmark tasks—including nonlinear dynamical systems—KASPE achieves substantial improvements over ABC and related approaches in inference accuracy, generalization capability, and computational stability. Once trained, the model enables plug-and-play posterior inference, drastically reducing runtime overhead.

Technology Category

Application Category

📝 Abstract
Generative models and those with computationally intractable likelihoods are widely used to describe complex systems in the natural sciences, social sciences, and engineering. Fitting these models to data requires likelihood-free inference methods that explore the parameter space without explicit likelihood evaluations, relying instead on sequential simulation, which comes at the cost of computational efficiency and extensive tuning. We develop an alternative framework called kernel-adaptive synthetic posterior estimation (KASPE) that uses deep learning to directly reconstruct the mapping between the observed data and a finite-dimensional parametric representation of the posterior distribution, trained on a large number of simulated datasets. We provide theoretical justification for KASPE and a formal connection to the likelihood-based approach of expectation propagation. Simulation experiments demonstrate KASPE's flexibility and performance relative to existing likelihood-free methods including approximate Bayesian computation in challenging inferential settings involving posteriors with heavy tails, multiple local modes, and over the parameters of a nonlinear dynamical system.
Problem

Research questions and friction points this paper is trying to address.

Estimating posterior density without likelihood evaluations
Improving computational efficiency in likelihood-free inference
Handling complex posteriors with heavy tails or multiple modes
Innovation

Methods, ideas, or system contributions that make the work stand out.

Deep learning reconstructs posterior mapping directly
Kernel-adaptive synthetic posterior estimation framework
Handles heavy tails and multiple modes
🔎 Similar Papers
R
Rui Zhang
Department of Statistics, The Ohio State University
O
Oksana A. Chkrebtii
Department of Statistics, The Ohio State University
Dongbin Xiu
Dongbin Xiu
Professor of Mathematics, The Ohio State University
applied and computational mathematicsuncertainty quantification