Asymptotic Optimism for Tensor Regression Models with Applications to Neural Network Compression

๐Ÿ“… 2026-03-26
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work addresses the reliance on cross-validation for rank selection in low-rank tensor regression by establishing, for the first time under Gaussian random covariate designs, a theoretical connection between the trainingโ€“test error gap (optimism) and the true tensor rank. Leveraging random matrix theory and expected generalization error analysis, the authors derive prediction-oriented rank selection criteria for both CP and Tucker decompositions that eliminate the need for cross-validation. The proposed framework is further extended to tensor model averaging and neural network compression, demonstrating strong empirical performance on image regression tasks: it substantially reduces model complexity while preserving predictive accuracy.
๐Ÿ“ Abstract
We study rank selection for low-rank tensor regression under a random covariate design. Under a Gaussian random-design model and mild conditions, we derive population expressions for the expected training–testing discrepancy (optimism) for both CP and Tucker decompositions. We further demonstrate that the optimism is minimized at the true tensor rank for both CP and Tucker regression. This yields a prediction-oriented rank-selection rule that aligns with cross-validation and extends naturally to tensor-model averaging. We also discuss conditions under which under- or over-ranked models may appear preferable, thereby clarifying the scope of the method. Finally, we showcase its practical utility on a real-world image regression task and extend its application to tensor-based compression of neural networks, highlighting its potential for model selection in deep learning.
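To make the idea of prediction-oriented rank selection concrete, here is a minimal sketch for the matrix (order-2 tensor) special case. It does not reproduce the paper's derived optimism expressions for CP/Tucker regression; instead it uses a generic complexity-penalized criterion (training error plus a BIC-style penalty on the rank-r parameter count, with the noise level assumed known) and a simple SVD truncation of the OLS estimate in place of a fitted CP/Tucker model. All variable names and the penalty form are illustrative assumptions, not the authors' method.

```python
import numpy as np

# Illustrative setup: y_i = <X_i, B> + noise, with B a rank-2 coefficient
# matrix and X_i a Gaussian random design (assumption matching the paper's
# random-design setting, but in the matrix special case).
rng = np.random.default_rng(0)
n, p, q, true_rank, sigma = 400, 8, 8, 2, 0.5

U = rng.standard_normal((p, true_rank))
V = rng.standard_normal((q, true_rank))
B = U @ V.T                                   # true low-rank coefficients

X = rng.standard_normal((n, p, q))            # Gaussian random covariates
y = np.einsum('ipq,pq->i', X, B) + sigma * rng.standard_normal(n)

# Unrestricted OLS estimate of B (vectorized design), then truncate to
# rank r via SVD as a stand-in for a rank-constrained fit.
Xmat = X.reshape(n, p * q)
b_ols, *_ = np.linalg.lstsq(Xmat, y, rcond=None)
B_ols = b_ols.reshape(p, q)

def rank_r_fit(B_full, r):
    """Best rank-r approximation of B_full (Eckart-Young truncation)."""
    u, s, vt = np.linalg.svd(B_full, full_matrices=False)
    return (u[:, :r] * s[:r]) @ vt[:r]

def criterion(r):
    """Training MSE plus a BIC-style penalty on the rank-r parameter
    count r*(p+q-r); a crude proxy for an optimism correction, NOT the
    paper's population optimism formula."""
    B_r = rank_r_fit(B_ols, r)
    resid = y - np.einsum('ipq,pq->i', X, B_r)
    train_err = np.mean(resid ** 2)
    df = r * (p + q - r)
    return train_err + sigma ** 2 * df * np.log(n) / n

best = min(range(1, min(p, q) + 1), key=criterion)
print(best)
```

In this synthetic example the criterion recovers the true rank without any held-out data: under-ranked fits pay a large bias in the training error, while over-ranked fits gain too little to offset the penalty. The paper's contribution is to replace the ad-hoc penalty used here with the exact population optimism under the Gaussian random design.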
Problem

Research questions and friction points this paper is trying to address.

tensor regression
rank selection
optimism
model selection
neural network compression
Innovation

Methods, ideas, or system contributions that make the work stand out.

tensor regression
rank selection
optimism
model compression
neural networks
๐Ÿ”Ž Similar Papers
No similar papers found.