Calibrated Principal Component Regression

📅 2025-10-21
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
In high-dimensional generalized linear models, standard principal component regression (PCR) suffers from an imbalanced bias–variance trade-off under overparameterization because of its hard truncation of principal components. To address this, we propose a calibrated PCR method: first, learn a low-variance prior in the principal component subspace; then, perform soft calibration via centered Tikhonov regularization in the original feature space to mitigate truncation-induced bias. Our approach integrates principal component analysis (PCA), cross-fitting, and random matrix theory, yielding provable out-of-sample risk guarantees. Theoretical analysis establishes that its asymptotic prediction risk improves on that of conventional PCR whenever the regression signal has non-negligible mass in low-variance directions. Empirical evaluation across multiple overparameterized tasks demonstrates superior predictive accuracy, greater stability, and better generalization than baseline methods.

๐Ÿ“ Abstract
We propose a new method for statistical inference in generalized linear models. In the overparameterized regime, Principal Component Regression (PCR) reduces variance by projecting high-dimensional data onto a low-dimensional principal subspace before fitting. However, PCR incurs truncation bias whenever the true regression vector has mass outside the retained principal components (PCs). To mitigate this bias, we propose Calibrated Principal Component Regression (CPCR), which first learns a low-variance prior in the PC subspace and then calibrates the model in the original feature space via a centered Tikhonov step. CPCR leverages cross-fitting and controls the truncation bias by softening PCR's hard cutoff. Theoretically, we characterize the out-of-sample risk in the random-matrix regime and show that CPCR outperforms standard PCR when the regression signal has non-negligible components in low-variance directions. Empirically, CPCR consistently improves prediction across multiple overparameterized problems. The results highlight CPCR's stability and flexibility in modern overparameterized settings.
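The two-stage procedure in the abstract can be written compactly. The display below is a plausible rendering under assumed notation (design matrix \(X\), top-\(k\) principal directions \(V_k\), PC-subspace prior \(\hat\beta_0\), calibration strength \(\lambda > 0\)), not the paper's exact objective:

```latex
% Stage 1: PCR prior in the top-k principal component subspace
\hat\gamma = \arg\min_{\gamma \in \mathbb{R}^k} \lVert y - X V_k \gamma \rVert_2^2,
\qquad
\hat\beta_0 = V_k \hat\gamma.

% Stage 2: soft calibration in the original feature space via a
% Tikhonov penalty centered at the prior (instead of hard truncation)
\hat\beta_{\mathrm{CPCR}} = \arg\min_{\beta \in \mathbb{R}^p}
\lVert y - X \beta \rVert_2^2
+ \lambda \lVert \beta - \hat\beta_0 \rVert_2^2.
```

As \(\lambda \to \infty\) the calibrated solution collapses back to the PCR prior, while finite \(\lambda\) lets the fit recover signal outside the retained subspace, which is the "softening of the hard cutoff" the abstract describes.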
Problem

Research questions and friction points this paper is trying to address.

Mitigating truncation bias in overparameterized regression models
Improving prediction accuracy via calibrated principal component analysis
Addressing the bias–variance trade-off in high-dimensional statistical inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Calibrated PCR softens hard cutoff to reduce bias
Learns low-variance prior in principal component subspace
Calibrates model via centered Tikhonov regularization step
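The three innovations above can be sketched in code. This is a minimal illustrative implementation under assumed details (a simple 50/50 cross-fitting split, SVD-based PCA, and an unpenalized least-squares fit for the prior); hyperparameters `k` and `lam` are placeholders, not values from the paper:

```python
import numpy as np

def cpcr_fit(X, y, k=10, lam=1.0, seed=0):
    """Calibrated PCR sketch: PCR prior, then centered Tikhonov calibration."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Cross-fitting: learn the prior on fold A, calibrate on fold B.
    idx = rng.permutation(n)
    a, b = idx[: n // 2], idx[n // 2:]

    # --- Stage 1: low-variance prior in the top-k PC subspace (fold A) ---
    Xa = X[a] - X[a].mean(axis=0)
    _, _, Vt = np.linalg.svd(Xa, full_matrices=False)
    Vk = Vt[:k].T                         # (p, k) top-k principal directions
    Z = Xa @ Vk                           # PC scores
    gamma = np.linalg.lstsq(Z, y[a], rcond=None)[0]
    beta0 = Vk @ gamma                    # prior mapped back to feature space

    # --- Stage 2: soft calibration via centered Tikhonov (fold B) ---
    # argmin_beta ||y_b - X_b beta||^2 + lam * ||beta - beta0||^2
    Xb = X[b] - X[b].mean(axis=0)
    A = Xb.T @ Xb + lam * np.eye(p)
    beta = np.linalg.solve(A, Xb.T @ y[b] + lam * beta0)
    return beta
```

Setting `lam` very large returns essentially the PCR prior `beta0`; a moderate `lam` lets the calibration step pick up signal in the truncated low-variance directions, which is the bias reduction the bullets describe.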