Objective Model Prior Probabilities in Variable Selection

📅 2026-03-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitations of conventional model priors in high-dimensional Bayesian variable selection: equal model prior probabilities unduly favor overly complex models, while Jeffreys' choice, which allocates mass uniformly across model sizes, has its own fundamental shortcomings. The authors systematically analyze these deficiencies and propose a class of objective priors that better balances theoretical rigor and empirical performance. Through theoretical analysis and numerical experiments, the proposed priors demonstrate superior variable-selection accuracy and more effective control of model complexity than existing choices, along with strong robustness and good finite-sample behavior, making them a compelling alternative for high-dimensional settings.

📝 Abstract
For many years it was routine to use equal model prior probabilities in Bayesian model uncertainty analysis. At least twenty years ago it became clear that this was problematic, leading to support of much too large models in the increasingly huge model spaces being considered in genomics and other fields. A popular replacement was to adopt a suggestion of Harold Jeffreys for the variable selection problem in which a total of $k$ possible variables are being considered for inclusion in the model: give the collection of all models containing $d$ variables ($d = 0, \ldots, k$) prior probability $1/(k + 1)$ and then divide this prior probability equally among the models in the collection. Many other choices of model prior probabilities that impose severe parsimony have also been introduced. We begin by reviewing the problems with using equal model prior probabilities and then discuss some serious problems with the Jeffreys choice. Finally, we introduce and study a number of objective alternative choices of model prior probabilities, from both numerical and theoretical perspectives.
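The Jeffreys-style allocation described in the abstract can be sketched numerically: each model size $d = 0, \ldots, k$ receives total mass $1/(k+1)$, split equally among the $\binom{k}{d}$ models of that size, so an individual model with $d$ variables gets prior probability $1/\big((k+1)\binom{k}{d}\big)$. The small sketch below (an illustration, not code from the paper) computes this and contrasts it with the equal-probability prior $1/2^k$ that the paper criticizes.

```python
from math import comb

def jeffreys_model_prior(k: int, d: int) -> float:
    """Prior probability of one specific model containing d of the k
    candidate variables, under the Jeffreys-style allocation: each size
    class d = 0..k gets total mass 1/(k+1), divided equally among the
    comb(k, d) models of that size."""
    return 1.0 / ((k + 1) * comb(k, d))

k = 10

# The single null model (d = 0) gets 1/(k+1) = 1/11, while each of the
# comb(10, 5) = 252 mid-sized models gets 1/(11 * 252) -- this is the
# built-in parsimony that the uniform prior 1 / 2**k lacks.
print(jeffreys_model_prior(k, 0))   # 1/11
print(jeffreys_model_prior(k, 5))   # 1/2772
print(1 / 2**k)                     # uniform prior: 1/1024 for every model

# Sanity check: summed over all 2**k models, the prior mass is 1.
total = sum(comb(k, d) * jeffreys_model_prior(k, d) for d in range(k + 1))
print(total)
```

Note how the uniform prior gives every model the same mass, so most prior probability concentrates on mid-sized models simply because there are more of them; the size-based allocation removes that bias by construction.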
Problem

Research questions and friction points this paper is trying to address.

Bayesian model selection
model prior probabilities
variable selection
Jeffreys prior
parsimony
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian model selection
objective prior
variable selection
model prior probabilities
parsimony
James Berger
Professor of Statistics, Duke University
Gonzalo García-Donato
University of Castilla-La Mancha
Elías Moreno
Royal Academy of Sciences, Spain
Luis Pericchi
University of Puerto Rico, Rio Piedras