🤖 AI Summary
Conventional model selection criteria (e.g., AIC, BIC, R²) suffer from poor robustness and inconsistent variable selection in high-dimensional regression where $p \gg n$.
Method: We propose a novel criterion that simultaneously ensures statistical robustness and asymptotic consistency. The approach integrates adaptive truncation, stable subsampling, and debiased covariance estimation, and is grounded in extreme value theory and high-dimensional M-estimation theory, yielding a unified framework with dual guarantees: robustness against contamination and consistency under sparsity.
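To make the adaptive-truncation component concrete, here is a minimal illustrative sketch in Python. It clips observations at a data-driven threshold that grows with the sample size, a standard robustification device for heavy-tailed data; the scale estimator (MAD), the threshold rule, and the constant `c` are illustrative assumptions, not the paper's actual tuning scheme.

```python
import numpy as np

def adaptive_truncate(y, c=2.0):
    """Clip observations at a sample-size-adaptive threshold.

    The threshold tau = c * sigma_hat * sqrt(n / log n) grows with n,
    so truncation vanishes asymptotically (preserving consistency)
    while bounding the influence of heavy-tailed outliers in finite
    samples. This tuning rule is a hypothetical illustration.
    """
    n = len(y)
    # Robust scale via the median absolute deviation (MAD),
    # rescaled to be consistent for the Gaussian standard deviation.
    sigma_hat = np.median(np.abs(y - np.median(y))) / 0.6745
    tau = c * sigma_hat * np.sqrt(n / np.log(n))
    return np.clip(y, -tau, tau)

# Heavy-tailed sample: Student-t with 1.5 degrees of freedom.
rng = np.random.default_rng(0)
y = rng.standard_t(df=1.5, size=500)
y_trunc = adaptive_truncate(y)
```

Because the threshold diverges as $n \to \infty$, the truncated estimator coincides with the untruncated one in the limit, which is the intuition behind pairing robustness with asymptotic consistency.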
Contribution/Results: We prove that the proposed criterion achieves a convergence rate of $O_P(n^{-1/2})$, substantially outperforming existing benchmarks. Extensive simulations and real-data analyses demonstrate an average 18.7% improvement in model selection accuracy, effectively mitigating heavy-tailed noise, the curse of dimensionality, and sampling bias.