Riemannian Laplace Approximation with the Fisher Metric

📅 2023-11-05
🏛️ International Conference on Artificial Intelligence and Statistics
📈 Citations: 3
Influential: 0
🤖 AI Summary
This work addresses the bias and over-concentration of the classical Laplace approximation in Bayesian inference, which becomes excessively narrow and inaccurate for complex models and limited data. The authors propose a Riemannian-geometric refinement centered on the Fisher information metric, redesigning the curvature-aware metric structure to yield a Laplace approximation that is exact in the infinite-data limit. Building on this foundation, they introduce two variants that correct the bias arising from the metric choice in existing Riemannian Laplace methods. The theoretical analysis extends the asymptotic statistical framework of the method, establishing conditions for consistency and accuracy, and empirical evaluation demonstrates improved posterior approximation fidelity and calibration, with robust performance even in finite-sample regimes.
📝 Abstract
Laplace's method approximates a target density with a Gaussian distribution at its mode. It is computationally efficient and asymptotically exact for Bayesian inference due to the Bernstein-von Mises theorem, but for complex targets and finite-data posteriors it is often too crude an approximation. A recent generalization of the Laplace approximation transforms the Gaussian approximation according to a chosen Riemannian geometry, providing a richer approximation family while still retaining computational efficiency. However, as shown here, its properties depend heavily on the chosen metric; indeed, the metric adopted in previous work results in approximations that are overly narrow as well as biased even in the limit of infinite data. We correct this shortcoming by developing the approximation family further, deriving two alternative variants that are exact in the limit of infinite data, extending the theoretical analysis of the method, and demonstrating practical improvements in a range of experiments.
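To make the abstract's starting point concrete, below is a minimal sketch of the classical Laplace approximation the paper generalizes: find the mode of the target density, take the curvature (second derivative of the negative log density) there, and use its inverse as the Gaussian variance. The toy Beta(3, 5)-shaped target is an illustrative choice, not an example from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy target: unnormalized Beta(3, 5) density over theta in (0, 1),
# standing in for a finite-data posterior.
a, b = 3.0, 5.0

def neg_log_target(theta):
    # Negative log of the unnormalized Beta(a, b) density.
    return -((a - 1) * np.log(theta) + (b - 1) * np.log(1 - theta))

# Step 1: find the mode (the MAP estimate).
res = minimize(neg_log_target, x0=np.array([0.5]),
               bounds=[(1e-6, 1 - 1e-6)])
mode = res.x[0]

# Step 2: curvature at the mode via a central finite difference.
h = 1e-5
hess = (neg_log_target(mode + h) - 2 * neg_log_target(mode)
        + neg_log_target(mode - h)) / h**2

# Step 3: the Laplace approximation is N(mode, hess^{-1}).
sigma2 = 1.0 / hess
print(mode, sigma2)
```

For this target the mode is (a-1)/(a+b-2) = 1/3 and the curvature there is 27, so the approximation is N(1/3, 1/27). The skew of the Beta density is exactly what such a symmetric Gaussian cannot capture, which motivates the richer Riemannian family discussed in the abstract.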
Problem

Research questions and friction points this paper is trying to address.

Improves the Laplace approximation for complex Bayesian inference.
Addresses bias and over-narrowness in Riemannian-geometry-based approximations.
Develops variants that are exact in the infinite-data limit.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Riemannian geometry to transform the Gaussian approximation.
Corrects bias with two variants that are exact in the infinite-data limit.
Demonstrates practical improvements across a range of experiments.
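As background for the Fisher-metric choice named in the title, here is a minimal sketch (not the paper's algorithm) of the quantity involved: for a Bernoulli model, the observed information at the maximum-likelihood estimate coincides with the Fisher information evaluated there, which is the kind of curvature the Fisher metric encodes. The sample size and seed are arbitrary illustrative choices.

```python
import numpy as np

# Bernoulli model: n coin flips with k successes.
rng = np.random.default_rng(0)
n = 200
x = rng.random(n) < 0.3  # synthetic data, true theta = 0.3
k = x.sum()

theta_hat = k / n  # maximum-likelihood estimate

# Observed information: negative second derivative of the log-likelihood
# k*log(theta) + (n - k)*log(1 - theta), evaluated at theta_hat.
observed = k / theta_hat**2 + (n - k) / (1 - theta_hat) ** 2

# Fisher information: the model expectation of the observed information,
# I(theta) = n / (theta * (1 - theta)), evaluated at theta_hat.
fisher = n / (theta_hat * (1 - theta_hat))

print(observed, fisher)
```

Substituting theta_hat = k/n into both expressions gives n^3 / (k (n - k)) in each case, so the two agree exactly here; for general models they agree only in expectation, and the paper's analysis concerns how this metric choice affects the resulting Riemannian Laplace approximation.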