New (and old) predictive schemes with a.c.i.d. sequences

📅 2025-07-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses a limitation of conventional Bayesian inference, which relies on prespecified parametric models and prior distributions. To overcome this, the authors propose a model-free prediction-inference framework that requires neither an explicit statistical model nor a prior distribution; instead, valid statistical inference is achieved solely through prediction rules satisfying a generalized martingale condition. Methodologically, leveraging the theory of asymptotically conditionally independent and identically distributed (a.c.i.d.) sequences and bootstrap-type resampling mechanisms, the paper unifies heterogeneous prediction approaches—including kernel estimation, the parametric Bayesian bootstrap, and copula modeling—within this framework. Crucially, classical assumptions are systematically relaxed using generalized martingale theory. The primary contribution is a substantial expansion of the class of prediction rules that support valid inference, extending applicability beyond existing model-free Bayesian methods and enhancing the flexibility, generality, and theoretical scope of nonparametric Bayesian inference.

📝 Abstract
There is a growing interest in procedures for Bayesian inference that bypass the need to specify a model and prior but simply rely on a predictive rule that describes how we learn about future observations given the available ones. At the heart of the idea is a bootstrap-type scheme that allows us to move from the realm of prediction to that of inference. Which conditions the predictive rule needs to satisfy to produce valid inference is a key question. In this work, we substantially relax previous assumptions building on a generalization of martingales, opening up the possibility of employing a much wider range of predictive rules that were previously ruled out. These include "old" ideas in Statistics and Learning Theory, such as kernel estimators, and more novel ones, such as the parametric Bayesian bootstrap or copula-based algorithms. Our aim is not to advocate in favor of one predictive rule versus the others, but rather to showcase the benefits of working with this larger class of predictive rules.
Problem

Research questions and friction points this paper is trying to address.

Relaxing assumptions for valid Bayesian inference via predictive rules
Exploring a broader class of predictive rules for statistical learning
Generalizing martingales to enable diverse inference methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bootstrap-type scheme for predictive inference
Generalized martingales relax previous assumptions
Includes kernel estimators and copula-based algorithms
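To make the bootstrap-type scheme concrete, below is a minimal sketch of predictive resampling: future observations are simulated one at a time from a predictive rule and fed back into it, and repeating this over many replicates yields a sample that approximates a posterior for a functional such as the mean. The specific rule used here (a Pólya-urn predictive, with a uniform base measure) and all function names are illustrative assumptions, not the paper's actual algorithms.

```python
import random

def predictive_resample(data, n_future=1000, n_replicates=500, alpha=1.0, seed=0):
    """Illustrative predictive resampling for the posterior of the mean.

    Predictive rule (assumed here for illustration): the Polya urn of the
    Dirichlet process. Given n observed values, the next observation equals
    a uniformly chosen past value with probability n / (alpha + n), and is
    otherwise a fresh draw from a Uniform(0, 1) base measure. Each imputed
    value is appended to the sample, so the rule is updated sequentially --
    the martingale-style feedback loop behind bootstrap-type inference.
    """
    rng = random.Random(seed)
    means = []
    for _ in range(n_replicates):
        sample = list(data)
        for _ in range(n_future):
            n = len(sample)
            if rng.random() < n / (alpha + n):
                sample.append(rng.choice(sample))  # recycle a past value
            else:
                sample.append(rng.random())        # fresh draw from base measure
        # one replicate of the functional of interest (here, the mean)
        means.append(sum(sample) / len(sample))
    return means
```

The spread of the returned means across replicates plays the role of posterior uncertainty; swapping in a different predictive rule (e.g., a kernel- or copula-based update) changes only the inner sampling step.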
Marco Battiston
School of Mathematical Sciences, Lancaster University
Lorenzo Cappello
Stanford University
Statistics