CP4SBI: Local Conformal Calibration of Credible Sets in Simulation-Based Inference

📅 2025-08-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
In simulation-based inference (SBI), neural posterior estimators often suffer from miscalibration, leading to inaccurate credible set coverage. To address this, we propose a model-agnostic conformal calibration framework that—uniquely—provides finite-sample, locally Bayesian coverage guarantees for arbitrary scoring functions, including highest posterior density (HPD) regions, symmetric intervals, and quantile-based intervals. Our method integrates conformal prediction with local regression techniques, introducing two novel variants: regression-tree-based local calibration and CDF-guided global–local collaborative calibration. The framework is compatible with mainstream neural density estimators such as normalizing flows and score-based diffusion models. Extensive evaluation across standard SBI benchmarks demonstrates substantial improvements in calibration across diverse posterior estimators, enabling more reliable uncertainty quantification.

📝 Abstract
Experimental scientists increasingly rely on simulation-based inference (SBI) to invert complex non-linear models with intractable likelihoods. However, posterior approximations obtained with SBI are often miscalibrated, causing credible regions to undercover the true parameters. We develop $\texttt{CP4SBI}$, a model-agnostic conformal calibration framework that constructs credible sets with local Bayesian coverage. Our two proposed variants, local calibration via regression trees and CDF-based calibration, enable finite-sample local coverage guarantees for any scoring function, including HPD, symmetric, and quantile-based regions. Experiments on widely used SBI benchmarks demonstrate that our approach improves the quality of uncertainty quantification for neural posterior estimators based on both normalizing flows and score-based diffusion models.
Problem

Research questions and friction points this paper is trying to address.

Calibrating credible sets produced by simulation-based inference (SBI)
Correcting miscalibration of SBI posterior approximations
Guaranteeing local coverage for credible regions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Conformal calibration framework for credible sets
Local Bayesian coverage via regression trees
CDF-based calibration with finite-sample guarantees
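As a concrete illustration of the regression-tree idea, local conformal calibration can be sketched as split conformal prediction with leaf-wise quantiles: a shallow tree partitions the observation space, and each leaf receives its own conformal threshold. The toy simulator, the identity "posterior mean", and all names below are illustrative assumptions, not the authors' implementation:

```python
# Hedged sketch of locally calibrated conformal credible intervals, in the
# spirit of the paper's regression-tree variant (illustrative only).
# Conformity score: |theta - posterior_mean(x)|, with the posterior mean
# crudely approximated by the identity map for this toy model.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

def simulate(n):
    # theta ~ N(0, 1); x | theta ~ N(theta, 0.5 + |theta|), so the noise
    # level (and hence the calibrated interval width) varies with x.
    theta = rng.normal(0.0, 1.0, n)
    x = rng.normal(theta, 0.5 + np.abs(theta))
    return x, theta

# Calibration simulations and their conformity scores.
x_cal, th_cal = simulate(2000)
scores = np.abs(th_cal - x_cal)

# A shallow tree regressing the score on x; its leaves define the local
# partition used for leaf-wise conformal quantiles.
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=300)
tree.fit(x_cal.reshape(-1, 1), scores)
leaf_cal = tree.apply(x_cal.reshape(-1, 1))

alpha = 0.1  # target miscoverage

def local_quantile(x_new):
    # Finite-sample conformal quantile computed within each leaf.
    leaf = tree.apply(np.asarray(x_new).reshape(-1, 1))
    q = np.empty(len(leaf))
    for l in np.unique(leaf):
        s = np.sort(scores[leaf_cal == l])
        k = int(np.ceil((len(s) + 1) * (1 - alpha)))
        q[leaf == l] = s[min(k, len(s)) - 1]
    return q

# Empirical coverage of the calibrated sets on fresh simulations.
x_te, th_te = simulate(2000)
q = local_quantile(x_te)
covered = np.abs(th_te - x_te) <= q
print(covered.mean())  # should land near the nominal 1 - alpha = 0.9
```

The leaf-wise quantile is what distinguishes this from global split conformal: regions of x-space with noisier posteriors get wider credible sets instead of a single pooled threshold.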
Luben M. C. Cabezas
Department of Statistics, Federal University of São Carlos
Vagner S. Santos
Department of Statistics, Federal University of São Carlos
Thiago R. Ramos
Department of Statistics, Federal University of São Carlos
Pedro L. C. Rodrigues
Univ. Grenoble Alpes, Inria, CNRS, Grenoble INP, LJK
Rafael Izbicki
Federal University of São Carlos
Statistics · Machine Learning · Nonparametric Methods · High-dimensional Inference · Data Science