🤖 AI Summary
In simulation-based inference (SBI), neural posterior estimators often suffer from miscalibration, leading to inaccurate credible set coverage. To address this, we propose a model-agnostic conformal calibration framework that—uniquely—provides finite-sample, locally Bayesian coverage guarantees for arbitrary scoring functions, including highest posterior density (HPD) regions, symmetric intervals, and quantile-based intervals. Our method integrates conformal prediction with local regression techniques, introducing two novel variants: regression-tree-based local calibration and CDF-guided global–local collaborative calibration. The framework is compatible with mainstream neural density estimators such as normalizing flows and score-based diffusion models. Extensive evaluation across standard SBI benchmarks demonstrates substantial improvements in calibration across diverse posterior estimators, enabling more reliable uncertainty quantification.
📝 Abstract
Experimental scientists increasingly rely on simulation-based inference (SBI) to invert complex non-linear models with intractable likelihoods. However, posterior approximations obtained with SBI are often miscalibrated, causing credible regions to undercover the true parameters. We develop $\texttt{CP4SBI}$, a model-agnostic conformal calibration framework that constructs credible sets with local Bayesian coverage. Our two proposed variants, namely local calibration via regression trees and CDF-based calibration, enable finite-sample local coverage guarantees for any scoring function, including HPD, symmetric, and quantile-based regions. Experiments on widely used SBI benchmarks demonstrate that our approach improves the quality of uncertainty quantification for neural posterior estimators based on both normalizing flows and score-based diffusion models.
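To make the idea concrete, here is a minimal, self-contained sketch of split conformal calibration of HPD-style credible sets, including a regression-tree local variant in the spirit of the paper's first proposed method. Everything below is illustrative, not the paper's actual API: the toy "posterior estimator" `log_q` is a deliberately overconfident Gaussian, the negative log-density is used as the conformity score, and the tree partition stands in for the paper's local calibration scheme.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy setup (illustrative, not from the paper): the true posterior is
# theta | x ~ N(x, 1), but the "learned" estimator reports N(x, 0.5^2),
# i.e. it is overconfident and its HPD sets will undercover.
def log_q(theta, x, sigma=0.5):
    return -0.5 * ((theta - x) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

# Calibration data (x_i, theta_i) drawn from the true joint distribution.
n_cal = 5000
x_cal = rng.uniform(-3, 3, n_cal)
theta_cal = x_cal + rng.normal(size=n_cal)

# Conformity score: negative log posterior density -> thresholding it
# yields HPD-style regions {theta : q(theta | x) >= level}.
scores = -log_q(theta_cal, x_cal)

alpha = 0.1  # target 90% coverage

# --- Global split conformal: a single threshold for all x ------------
k = int(np.ceil((n_cal + 1) * (1 - alpha)))
t_global = np.sort(scores)[k - 1]

# --- Tree-based local variant: partition x-space by regressing the
# score on x, then take a separate conformal quantile in each leaf.
tree = DecisionTreeRegressor(max_leaf_nodes=8, min_samples_leaf=200)
tree.fit(x_cal.reshape(-1, 1), scores)
leaf_cal = tree.apply(x_cal.reshape(-1, 1))

def local_threshold(x_new):
    leaf = tree.apply(np.array([[x_new]]))[0]
    s_leaf = np.sort(scores[leaf_cal == leaf])
    m = len(s_leaf)
    j = min(int(np.ceil((m + 1) * (1 - alpha))), m)
    return s_leaf[j - 1]

# Coverage check on fresh draws: the uncalibrated 90% HPD set of the
# overconfident estimator undercovers, while the conformal sets recover
# (approximately) the nominal 90% level.
n_test = 2000
x_te = rng.uniform(-3, 3, n_test)
th_te = x_te + rng.normal(size=n_test)
cover_naive = np.mean(-log_q(th_te, x_te) <= -log_q(x_te + 0.5 * 1.645, x_te))
cover_global = np.mean(-log_q(th_te, x_te) <= t_global)
cover_local = np.mean([
    -log_q(t, x) <= local_threshold(x) for t, x in zip(th_te, x_te)
])
print(f"naive (overconfident) coverage: {cover_naive:.3f}")
print(f"conformal global coverage:      {cover_global:.3f}")
print(f"conformal tree-local coverage:  {cover_local:.3f}")
```

In this toy example the local thresholds add little over the global one, since the score distribution does not actually vary with $x$; the point of the local variants in the paper is precisely to adapt the threshold when it does, which a single global quantile cannot.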