🤖 AI Summary
Approximate Bayesian inference often underestimates uncertainty, producing posterior credible intervals that are too narrow. This work proposes two recalibration methods built on simulation-based calibration checking (SBC), systematically leveraging the SBC framework to widen approximate posterior uncertainty intervals and achieve marginal calibration. The approach applies to complex model structures, including hierarchical models, and shows consistent efficacy across diverse experimental settings. As a result, the proposed recalibration substantially improves the calibration and reliability of approximate Bayesian inference.
📝 Abstract
Bayesian inference is often implemented using approximations, which can yield interval estimates that are too narrow, not fully capturing the uncertainty in the posterior distribution. We address the question of how to adjust these approximate posteriors so that they appropriately capture uncertainty. We introduce two methods that extend simulation-based calibration checking (SBC) to widen approximate posterior uncertainty intervals, aiming for marginal calibration. We demonstrate these methods in several experimental settings, and we discuss the challenge of calibration using posterior inferences and the potential for posterior recalibration of hierarchical models.
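To make the SBC framework underlying these methods concrete, here is a minimal sketch of the standard SBC rank check on a toy conjugate model (the model, names, and numbers are illustrative assumptions, not the paper's setup): draw a parameter from the prior, simulate data, draw from the posterior, and record the rank of the true parameter among the posterior draws. For a calibrated posterior the ranks are uniform; a too-narrow approximate posterior yields a U-shaped rank histogram, which is the miscalibration signal that recalibration methods respond to by widening intervals.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 1000, 99  # SBC replications, posterior draws per replication

ranks = np.empty(N, dtype=int)
for i in range(N):
    theta = rng.normal(0.0, 1.0)   # draw parameter from the prior
    y = rng.normal(theta, 1.0)     # simulate one observation given theta
    # Exact posterior for this toy model is N(y/2, sqrt(1/2)); an
    # approximation that is too narrow would shrink this sd by s < 1.
    post = rng.normal(y / 2.0, np.sqrt(0.5), size=L)
    ranks[i] = int(np.sum(post < theta))  # SBC rank statistic in {0, ..., L}

# For a well-calibrated posterior, ranks are uniform on {0, ..., L}.
print(ranks.min(), ranks.max(), round(ranks.mean(), 1))
```

Replacing the exact posterior draws with draws from an approximate inference method (and optionally rescaling their spread) is the point where the recalibration idea enters: the rank distribution diagnoses how much the intervals need to widen.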