The Pivotal Information Criterion

📅 2026-03-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitations of traditional information criteria, such as AIC and BIC, in high-dimensional settings, where they often yield excessive false discoveries and require an intractable discrete (best subset) optimization. The authors propose a new criterion, PIC, formulated as a continuous optimization problem. Leveraging a statistic that is asymptotically pivotal under pure noise, PIC adaptively calibrates its penalty parameter to target the detection boundary in model selection. The method achieves substantially improved variable-selection accuracy while maintaining competitive predictive performance. The analysis reveals a sharp phase transition in the probability of exact support recovery, and both simulation studies and real-data experiments demonstrate that PIC consistently selects sparser, more accurate models.
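To make the contrast with discrete best-subset search concrete, below is a minimal sketch of a continuous surrogate: a single penalized problem solved over $\mathbb{R}^p$ rather than a search over $2^p$ subsets. The $\ell_1$-penalized least-squares objective and the ISTA solver here are illustrative assumptions, not PIC's exact formulation.

```python
import numpy as np

def fit_continuous(X, y, lam, n_iter=500):
    """Illustrative continuous relaxation: solve one l1-penalized
    least-squares problem over R^p by proximal gradient (ISTA),
    instead of scoring all 2**p candidate supports as a discrete
    criterion (AIC/BIC best subset) would in principle require.
    The l1 penalty is an assumption for this sketch, not PIC's objective.
    """
    n, p = X.shape
    # Step size from the Lipschitz constant of the smooth part 0.5/n * ||y - Xb||^2.
    L = np.linalg.norm(X, 2) ** 2 / n
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = -X.T @ (y - X @ beta) / n                            # gradient step
        z = beta - grad / L
        beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)    # soft-thresholding
    return beta
```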

📝 Abstract
The Bayesian and Akaike information criteria aim at finding a good balance between under- and over-fitting, and they are used extensively every day by practitioners. Yet we contend they suffer from at least two afflictions: their penalty parameters, $\lambda=\log n$ and $\lambda=2$, are too small, leading to many false discoveries, and their inherent (best subset) discrete optimization is infeasible in high dimension. We alleviate these issues with the pivotal information criterion: PIC is defined as a continuous optimization problem, and the PIC penalty parameter $\lambda$ is selected at the detection boundary (under pure noise). PIC's choice of $\lambda$ is the quantile of a statistic that we prove to be asymptotically pivotal, provided the loss function is appropriately transformed. As a result, simulations show a phase transition in the probability of exact support recovery with PIC, a phenomenon studied in the noiseless setting in compressed sensing. Applied to real data, for similar predictive performance, PIC selects the least complex model among state-of-the-art learners.
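As a hedged sketch of what "selecting $\lambda$ at the detection boundary" can look like in practice: under pure noise $y = \sigma g$ with $g \sim N(0, I_n)$, the statistic $\|X^\top g\|_\infty / \|g\|_2$ does not depend on the unknown $\sigma$, so its quantile can be estimated by Monte Carlo. This particular statistic comes from the square-root-lasso literature and is used here as a stand-in assumption; the paper's transformed loss and exact pivotal statistic may differ.

```python
import numpy as np

def pivotal_lambda(X, alpha=0.05, n_mc=2000, seed=0):
    """Monte Carlo calibration of the penalty under pure noise.

    Draws pure-noise responses g ~ N(0, I_n) and computes
    T = ||X^T g||_inf / ||g||_2, which is free of the noise level
    sigma (pivotal). Returns the (1 - alpha) quantile of T, so that
    under pure noise the fitted model is empty with probability
    about 1 - alpha. The statistic is the square-root-lasso one,
    an assumed stand-in for PIC's pivotal statistic.
    """
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    stats = np.empty(n_mc)
    for b in range(n_mc):
        g = rng.standard_normal(n)
        stats[b] = np.max(np.abs(X.T @ g)) / np.linalg.norm(g)
    return np.quantile(stats, 1.0 - alpha)

# Example: high-dimensional design with p >> n.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 500))
lam = pivotal_lambda(X)
```

Note that the scale of $\lambda$ is tied to the loss it was calibrated for, so handing it to a solver with a differently normalized objective would require matching the normalizations first.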
Problem

Research questions and friction points this paper is trying to address.

information criteria
false discoveries
high-dimensional optimization
model selection
penalty parameter
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pivotal Information Criterion
continuous optimization
penalty parameter selection
exact support recovery
asymptotically pivotal