Deep Neural Regression Collapse

📅 2026-03-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates whether neural collapse—a phenomenon originally observed in classification tasks—also occurs in deep neural networks trained for regression, particularly beyond the output layer. Through feature subspace analysis, covariance alignment metrics, linear prediction error evaluation, and ablation studies on weight decay, the work provides the first evidence that Neural Regression Collapse (NRC) is prevalent throughout deep layers, not confined to the final layer alone. The findings reveal that models spontaneously learn low-rank structures aligned with the target dimensionality, that global prediction error can be effectively approximated by local linear errors, and that weight decay plays a pivotal role in inducing NRC in deeper layers. These insights elucidate an intrinsic, geometrically simple structure within deep regression models, significantly extending the current understanding of neural collapse phenomena.
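The "low-rank structures aligned with the target dimensionality" claim can be illustrated with a small sketch. Nothing below comes from the paper's code; the feature matrix is simulated so that it lies (up to noise) in a k-dimensional subspace, mimicking a collapsed layer, and the effective rank is then read off the singular values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n samples, d-dimensional features, k-dimensional targets.
n, d, k = 1000, 64, 2

# Simulate a "collapsed" layer: features lie (up to small noise) in a
# k-dimensional subspace of the d-dimensional feature space.
basis = np.linalg.qr(rng.standard_normal((d, k)))[0]       # orthonormal d x k basis
coords = rng.standard_normal((n, k))
H = coords @ basis.T + 0.01 * rng.standard_normal((n, d))  # feature matrix, n x d

# Effective rank via singular values of the centered feature matrix:
# for a collapsed layer, roughly k singular values should carry the energy.
Hc = H - H.mean(axis=0)
s = np.linalg.svd(Hc, compute_uv=False)
energy = np.cumsum(s**2) / np.sum(s**2)
eff_rank = int(np.searchsorted(energy, 0.99) + 1)
print(eff_rank)  # ~ k for collapsed features
```

On real networks one would extract H from an intermediate layer via a forward hook and run the same spectral check layer by layer; the 0.99 energy threshold here is an arbitrary illustrative choice.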

📝 Abstract
Neural Collapse is a phenomenon that helps identify sparse and low rank structures in deep classifiers. Recent work has extended the definition of neural collapse to regression problems, albeit only measuring the phenomenon at the last layer. In this paper, we establish that Neural Regression Collapse (NRC) also occurs below the last layer across different types of models. We show that in the collapsed layers of neural regression models, features lie in a subspace that corresponds to the target dimension, the feature covariance aligns with the target covariance, the input subspace of the layer weights aligns with the feature subspace, and the linear prediction error of the features is close to the overall prediction error of the model. In addition to establishing Deep NRC, we also show that models that exhibit Deep NRC learn the intrinsic dimension of low rank targets and explore the necessity of weight decay in inducing Deep NRC. This paper provides a more complete picture of the simple structure learned by deep networks in the context of regression.
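The abstract's claim that "the linear prediction error of the features is close to the overall prediction error of the model" suggests a simple diagnostic: fit a linear probe from a layer's features to the targets and compare its error to the model's. The sketch below is not the paper's procedure; it uses synthetic features that are (noisily) linearly decodable, as Deep NRC predicts for collapsed layers.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 500, 32, 3

# Hypothetical data: targets are noisily linearly decodable from the
# intermediate-layer features, as expected at a collapsed layer.
H = rng.standard_normal((n, d))                      # features at some layer
W_true = rng.standard_normal((d, k))
Y = H @ W_true + 0.05 * rng.standard_normal((n, k))  # regression targets

# Linear probe: least-squares fit from features to targets.
W, *_ = np.linalg.lstsq(H, Y, rcond=None)
probe_mse = np.mean((H @ W - Y) ** 2)
print(probe_mse)  # close to the noise floor (0.05**2 = 0.0025)
```

In practice one would compare `probe_mse` against the trained model's end-to-end test error at each depth; under Deep NRC the two should nearly coincide for the collapsed layers.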
Problem

Research questions and friction points this paper is trying to address.

Neural Collapse
Regression
Deep Neural Networks
Low Rank Structure
Feature Subspace
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural Regression Collapse
Deep Neural Networks
Low-rank Structure
Intrinsic Dimension
Weight Decay