🤖 AI Summary
Standard Gibbs posteriors suffer from unreliable uncertainty quantification: frequentist coverage of credible intervals deviates severely from nominal levels and fails to improve with increasing sample size, a consequence of relying on a single temperature parameter.
Method: We propose a sequential Gibbs posterior framework that incrementally incorporates loss information across stages, enhancing inferential reliability. Methodologically, we integrate nonconvex loss modeling with asymptotic statistical theory on manifolds.
Contributions/Results: Theoretically, we establish the first concentration properties and Bernstein–von Mises (BvM) theorem for sequential Gibbs posteriors; further, we derive the first BvM result for likelihood-based Bayesian posteriors on manifolds. In principal component analysis (PCA), our framework substantially improves frequentist coverage of credible intervals while rigorously guaranteeing asymptotic normality and optimal convergence rates in large samples.
📝 Abstract
Gibbs posteriors are proportional to a prior distribution multiplied by an exponentiated loss function, with a key tuning parameter that weights information in the loss relative to the prior and thereby controls posterior uncertainty. Gibbs posteriors provide a principled framework for likelihood-free Bayesian inference, but in many settings, relying on a single tuning parameter inevitably leads to poor uncertainty quantification. In particular, regardless of the parameter's value, credible regions have coverage far from the nominal frequentist level even in large samples. We propose a sequential extension of Gibbs posteriors to address this problem. We prove that the proposed sequential posterior exhibits concentration and satisfies a Bernstein–von Mises theorem, which holds under easy-to-verify conditions both in Euclidean space and on manifolds. As a byproduct, we obtain the first Bernstein–von Mises theorem for traditional likelihood-based Bayesian posteriors on manifolds. All methods are illustrated with an application to principal component analysis.
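To make the role of the tuning parameter concrete, the following is a minimal toy sketch (not the paper's method) of a standard Gibbs posterior for a one-dimensional location parameter under squared-error loss, evaluated on a grid. The names `omega`, `gibbs_posterior`, and `posterior_sd` are illustrative choices, as are the standard normal prior and the simulated data; the sketch only shows how the temperature weights the loss against the prior and hence scales posterior uncertainty.

```python
import numpy as np

# Simulated data for a location parameter (illustrative, not from the paper).
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=100)

# Grid over the parameter, empirical loss, and a standard normal prior
# (log-density up to an additive constant).
theta = np.linspace(0.0, 4.0, 2001)
loss = np.array([np.sum((x - t) ** 2) for t in theta])  # n * R_n(theta)
log_prior = -0.5 * theta ** 2

def gibbs_posterior(omega):
    """Gibbs posterior on the grid: proportional to prior * exp(-omega * loss)."""
    log_post = log_prior - omega * loss
    log_post -= log_post.max()   # stabilize before exponentiating
    w = np.exp(log_post)
    return w / w.sum()           # discrete probabilities on the grid

def posterior_sd(omega):
    """Posterior standard deviation, computed from the grid probabilities."""
    p = gibbs_posterior(omega)
    m = np.sum(theta * p)
    return np.sqrt(np.sum((theta - m) ** 2 * p))

# A larger temperature upweights the loss relative to the prior,
# shrinking the spread of the Gibbs posterior.
print(posterior_sd(0.1), posterior_sd(1.0))
```

Because coverage of the resulting credible intervals depends entirely on this single scalar, no one choice of `omega` can be calibrated for all aspects of the posterior at once, which is the failure mode the sequential construction is designed to address.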