FAB-PPI: Frequentist, Assisted by Bayes, Prediction-Powered Inference

📅 2025-02-04
🤖 AI Summary
This paper addresses estimation bias and overly wide confidence intervals in prediction-powered inference (PPI) caused by unstable prediction quality. The authors propose a "frequentist-dominant, Bayesian-assisted" fusion framework that embeds Bayesian prior knowledge into the PPI pipeline. Specifically, it performs prior-guided adaptive correction via prediction error calibration and heavy-tailed distribution modeling. A fallback mechanism automatically reverts to standard frequentist inference in regions where the prior is unreliable, thereby rigorously preserving asymptotic unbiasedness, consistency, and nominal coverage. Theoretical analysis and experiments on both synthetic and real-world datasets show that the method improves estimation accuracy and yields narrower confidence intervals, particularly under volatile prediction quality, outperforming both conventional PPI and purely Bayesian approaches while maintaining statistical reliability and practical robustness.

📝 Abstract
Prediction-powered inference (PPI) enables valid statistical inference by combining experimental data with machine learning predictions. When a sufficient number of high-quality predictions is available, PPI results in more accurate estimates and tighter confidence intervals than traditional methods. In this paper, we propose to inform the PPI framework with prior knowledge on the quality of the predictions. The resulting method, which we call frequentist, assisted by Bayes, PPI (FAB-PPI), improves over PPI when the observed prediction quality is likely under the prior, while maintaining its frequentist guarantees. Furthermore, when using heavy-tailed priors, FAB-PPI adaptively reverts to standard PPI in low prior probability regions. We demonstrate the benefits of FAB-PPI in real and synthetic examples.
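The abstract builds on the standard PPI estimator, which corrects the mean of machine-learning predictions on a large unlabeled set with a "rectifier" (the average prediction error measured on a small labeled set). A minimal sketch of that classical baseline for mean estimation is below; the function name and signature are illustrative, not from the paper's code, and FAB-PPI's prior-based refinement of the rectifier is not shown.

```python
import numpy as np
from statistics import NormalDist

def ppi_mean_ci(y_lab, yhat_lab, yhat_unlab, alpha=0.05):
    """Classical PPI point estimate and confidence interval for a mean.

    y_lab:      gold labels on the small labeled set (size n)
    yhat_lab:   model predictions on those same labeled points
    yhat_unlab: model predictions on the large unlabeled set (size N)
    """
    err = np.asarray(y_lab) - np.asarray(yhat_lab)
    n, N = len(err), len(yhat_unlab)
    # Rectifier: mean prediction error, estimated on the labeled data.
    rect = np.mean(err)
    theta = np.mean(yhat_unlab) + rect
    # Variance adds the unlabeled-mean and rectifier uncertainties.
    var = np.var(yhat_unlab, ddof=1) / N + np.var(err, ddof=1) / n
    half = NormalDist().inv_cdf(1 - alpha / 2) * np.sqrt(var)
    return theta, (theta - half, theta + half)
```

When the predictions are accurate, the rectifier term has small variance and the interval is driven by the cheap unlabeled sample, which is where PPI's tighter intervals come from; FAB-PPI additionally shrinks the rectifier toward prior knowledge about prediction quality.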
Problem

Research questions and friction points this paper is trying to address.

Enhances statistical inference with prior knowledge
Improves accuracy using prediction quality
Maintains frequentist guarantees in inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines experimental data with predictions
Uses prior knowledge on prediction quality
Adaptively reverts to standard PPI
Stefano Cortinovis
PhD Student, University of Oxford
Francois Caron
Department of Statistics, University of Oxford