🤖 AI Summary
For generative models and complex systems with intractable likelihoods, existing likelihood-free inference methods such as approximate Bayesian computation (ABC) rely on costly sequential simulation, making them computationally inefficient and difficult to tune. This paper introduces kernel-adaptive synthetic posterior estimation (KASPE), a framework that uses deep neural networks to learn an end-to-end mapping from observed data to a parametric representation of the posterior distribution, without explicit likelihood evaluation. Grounded in a theoretical connection to expectation propagation, KASPE naturally accommodates heavy-tailed and multimodal posteriors. On benchmark tasks, including a nonlinear dynamical system, KASPE outperforms ABC and related approaches in inference accuracy, generalization, and computational stability. Once trained, the model provides plug-and-play posterior inference for new datasets at negligible runtime cost.
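The amortized idea in the summary can be sketched on a toy conjugate-Gaussian model. Everything below (the toy model, the linear "network", and the hand-rolled training loop) is an illustrative assumption, not the paper's architecture: simulate (parameter, data) pairs from the prior, then fit a map from a data summary to the parameters of a Gaussian posterior by minimizing negative log-likelihood. Because the toy model is conjugate, the learned map can be checked against the exact posterior.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs = 9          # observations per simulated dataset
n_train = 20000    # simulated (theta, data) pairs

# Simulate training pairs from the prior and the model:
# theta ~ N(0, 1), x_i ~ N(theta, 1), summary s = sample mean.
theta = rng.normal(0.0, 1.0, n_train)
s = theta + rng.normal(0.0, 1.0, (n_obs, n_train)).mean(axis=0)

# Toy "network": a linear map s -> (mu, log_sigma) for a Gaussian posterior.
a, b, c = 0.0, 0.0, 0.0   # mu = a*s + b, log_sigma = c
lr = 0.05
for _ in range(200):
    mu = a * s + b
    sigma2 = np.exp(2 * c)
    r = theta - mu
    # Gradients of the mean NLL: log_sigma + (theta - mu)^2 / (2 sigma^2)
    g_mu = -r / sigma2
    a -= lr * np.mean(g_mu * s)
    b -= lr * np.mean(g_mu)
    c -= lr * np.mean(1.0 - r**2 / sigma2)

# Amortized inference: plug an observed summary straight into the trained map.
s_obs = 0.8
mu_hat, sd_hat = a * s_obs + b, np.exp(c)
# Conjugate ground truth: N(n*s/(n+1), 1/(n+1)), i.e. mean 0.72, sd ~0.316.
print(mu_hat, sd_hat)
```

The point of the sketch is the last three lines: all simulation cost is paid once, during training, and inference for any new observed summary is a single forward pass through the fitted map.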
📝 Abstract
Generative models and models with computationally intractable likelihoods are widely used to describe complex systems in the natural sciences, social sciences, and engineering. Fitting these models to data requires likelihood-free inference methods that explore the parameter space without explicit likelihood evaluations, relying instead on sequential simulation, which comes at the cost of computational efficiency and demands extensive tuning. We develop an alternative framework called kernel-adaptive synthetic posterior estimation (KASPE) that uses deep learning to directly reconstruct the mapping between the observed data and a finite-dimensional parametric representation of the posterior distribution, trained on a large number of simulated datasets. We provide theoretical justification for KASPE and a formal connection to the likelihood-based approach of expectation propagation. Simulation experiments demonstrate KASPE's flexibility and its performance relative to existing likelihood-free methods, including approximate Bayesian computation, in challenging settings involving posteriors with heavy tails and multiple local modes, and in inference for the parameters of a nonlinear dynamical system.
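For contrast, a minimal rejection-ABC sketch on a similar toy Gaussian model (again an illustrative assumption, not one of the paper's benchmarks) shows the simulation cost the abstract alludes to: parameters are drawn from the prior, a full dataset is simulated for each draw, and a draw is kept only if its summary lands near the observed summary, so the entire loop must be rerun from scratch for every new observation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs = 9
x_obs = rng.normal(0.5, 1.0, n_obs)   # pretend-observed data
s_obs = x_obs.mean()

# Rejection ABC: keep prior draws whose simulated summary is within eps of s_obs.
eps, accepted, n_sim = 0.05, [], 0
while len(accepted) < 1000:
    theta = rng.normal(0.0, 1.0)               # draw from the prior N(0, 1)
    s = rng.normal(theta, 1.0, n_obs).mean()   # simulate a whole dataset
    n_sim += 1
    if abs(s - s_obs) < eps:
        accepted.append(theta)

post = np.array(accepted)
# Exact posterior here is N(n*s_obs/(n+1), 1/(n+1)); ABC recovers it only
# approximately, at the price of many rejected simulations per accepted draw.
print(post.mean(), post.std(), n_sim)
```

Even on this one-parameter toy problem, thousands of simulations are discarded for every thousand accepted draws, and the acceptance rate collapses further as the tolerance `eps` shrinks or the parameter dimension grows; this is the efficiency gap that amortized, simulation-trained approaches aim to close.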