Confidence sequences with informative, bounded-influence priors

📅 2025-06-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
In Gaussian settings with known variance, confidence sequences that leverage prior information risk becoming vacuous (uninformatively wide) when the prior is poorly chosen. Method: We propose a framework that combines mixture martingales with global informative priors having polynomial or exponential tails, and applies the extended Ville's inequality to construct robust confidence sequences. Contribution/Results: The approach yields a "bounded influence" mechanism: when the prior is well specified, the sequence is substantially tighter than its non-informative counterpart; under severe prior misspecification, its width remains bounded rather than diverging. The resulting confidence sequences retain valid coverage at the prescribed level for any prior, achieving both statistical efficiency and robustness, and provide a principled, unified paradigm for sequential inference that bridges Bayesian and frequentist perspectives.

📝 Abstract
Confidence sequences are collections of confidence regions that simultaneously cover the true parameter for every sample size at a prescribed confidence level. Tightening these sequences is of practical interest and can be achieved by incorporating prior information through the method of mixture martingales. However, confidence sequences built from informative priors are vulnerable to misspecification and may become vacuous when the prior is poorly chosen. We study this trade-off for Gaussian observations with known variance. By combining the method of mixtures with a global informative prior whose tails are polynomial or exponential and the extended Ville's inequality, we construct confidence sequences that are sharper than their non-informative counterparts whenever the prior is well specified, yet remain bounded under arbitrary misspecification. The theory is illustrated with several classical priors.
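The non-informative baseline the abstract refers to is the classical normal-mixture (Robbins-style) confidence sequence: mix the Gaussian likelihood-ratio martingale over a N(0, τ²) mixing distribution and apply Ville's inequality. A minimal sketch of that baseline (the mixing scale τ and all parameter values here are illustrative defaults, not taken from the paper):

```python
import numpy as np

def mixture_cs_radius(t, alpha=0.05, sigma=1.0, tau=1.0):
    """Half-width of the normal-mixture confidence sequence for the mean
    of N(mu, sigma^2) observations after t samples.

    Mixing exp(lam*S_t - lam^2*sigma^2*t/2), with S_t = sum_i (x_i - mu),
    over lam ~ N(0, tau^2/sigma^2) gives the nonnegative martingale
        M_t = (1 + tau^2 t)^(-1/2) * exp(tau^2 S_t^2 / (2 sigma^2 (1 + tau^2 t))),
    and Ville's inequality P(sup_t M_t >= 1/alpha) <= alpha yields the
    time-uniform bound |mean(x_1..x_t) - mu| <= radius(t) below.
    """
    t = np.asarray(t, dtype=float)
    return (sigma / t) * np.sqrt(
        (1.0 + tau**2 * t) / tau**2 * np.log((1.0 + tau**2 * t) / alpha**2)
    )
```

The radius shrinks at the familiar sqrt(log t / t) rate, uniformly over time; the paper's informative-prior construction tightens this further when the prior guess is accurate.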
Problem

Research questions and friction points this paper is trying to address.

Develop confidence sequences robust to prior misspecification
Balance tightness and reliability in Gaussian parameter estimation
Combine informative priors with mixture methods for bounded influence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines mixture martingales with informative priors
Uses polynomial or exponential tail priors
Applies extended Ville's inequality for robustness
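The bounded-influence behavior listed above can be illustrated with a simple related device (not the paper's actual construction, which uses global priors with polynomial or exponential tails and the extended Ville's inequality): mix a small mass ε of a wide, near non-informative component into an informative mixture martingale. Since the mixture always dominates ε times the wide component, the confidence set can never exceed a log(1/ε)-inflated version of the non-informative one, however wrong the prior guess m is. All function names and parameter values below are illustrative assumptions:

```python
import numpy as np

def log_mixture_mart(s, t, c, tau):
    """Log of E_{lam ~ N(c, tau^2)}[exp(lam*s - lam^2*t/2)] in closed form,
    for unit-variance data, where s = S_t(mu) = sum(x) - t*mu and
    c = m - mu is the offset of the prior guess m from the tested mu."""
    a = t + 1.0 / tau**2            # posterior precision in lam
    b = s + c / tau**2
    return b**2 / (2.0 * a) - c**2 / (2.0 * tau**2) - np.log(tau) - 0.5 * np.log(a)

def cs_width(x, alpha=0.05, m=0.0, tau_inf=0.5, tau_wide=10.0, eps=0.1):
    """Width of the level-(1-alpha) confidence set from the two-component
    mixture eps * wide + (1 - eps) * informative, found by a grid scan."""
    t, grid = len(x), np.linspace(-10.0, 10.0, 4001)
    s = x.sum() - t * grid
    log_mix = np.logaddexp(
        np.log(eps) + log_mixture_mart(s, t, 0.0, tau_wide),       # wide part
        np.log1p(-eps) + log_mixture_mart(s, t, m - grid, tau_inf) # prior at m
    )
    inside = grid[log_mix < np.log(1.0 / alpha)]   # Ville: keep mu with M_t < 1/alpha
    return inside.max() - inside.min()

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 50)                   # true mean is 0
w_flat = cs_width(x, m=0.0, tau_inf=10.0)      # both components wide: baseline
w_good = cs_width(x, m=0.0)                    # well-specified prior: tighter
w_bad = cs_width(x, m=5.0)                     # badly misspecified: still bounded
```

In this toy run the well-specified prior narrows the set below the non-informative baseline, while the badly misspecified prior costs at most a modest log(1/ε) inflation instead of diverging.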