Neural Collapse in Cumulative Link Models for Ordinal Regression: An Analysis with Unconstrained Feature Model

📅 2025-06-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates, for the first time, whether a geometric simplification phenomenon analogous to Neural Collapse (NC) in classification, termed Ordinal Neural Collapse (ONC), emerges in deep ordinal regression (OR). Method: We combine the cumulative link model for OR with the Unconstrained Feature Model (UFM) and prove three defining properties: (1) under regularization, all optimal features within a class collapse to their within-class mean; (2) the class means align with the classifier, collapsing onto a one-dimensional subspace; and (3) the optimal latent variables are ordered according to the class order, and in the zero-regularization limit they satisfy a simple, local geometric relationship with the fixed threshold values. Contribution/Results: These properties are proved analytically within the UFM framework with fixed thresholds and corroborated empirically across a variety of datasets, yielding a geometric account of threshold design in OR and a theoretical foundation for interpretability and structured learning in ordinal prediction tasks.
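For reference, the three properties can be restated compactly. The notation below is ours and is only a plausible formalization, not necessarily the paper's:

```latex
% Illustrative restatement of ONC1-ONC3 (our notation, not the paper's).
% Features h_i with labels y_i in {1,...,K}, class means mu_k,
% classifier w, latents v_k, fixed thresholds b_1 < ... < b_{K-1}.
\begin{align*}
\text{(ONC1)}\quad & h_i = \mu_{y_i} \ \text{for all } i
    && \text{(within-class collapse under regularization)} \\
\text{(ONC2)}\quad & \mu_k \in \operatorname{span}\{w\} \ \text{for all } k
    && \text{(class means align with the classifier)} \\
\text{(ONC3)}\quad & v_1 < v_2 < \cdots < v_K, \quad v_k = \langle \mu_k, w \rangle
    && \text{(latents ordered; locally tied to the } b_k \text{ as regularization} \to 0)
\end{align*}
```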

📝 Abstract
A phenomenon known as "Neural Collapse (NC)" in deep classification tasks, in which the penultimate-layer features and the final classifiers exhibit an extremely simple geometric structure, has recently attracted considerable attention, with the expectation that it can deepen our understanding of how deep neural networks behave. The Unconstrained Feature Model (UFM) has been proposed to explain NC theoretically, and a growing body of work has emerged that extends NC to tasks other than classification and leverages it for practical applications. In this study, we investigate whether a similar phenomenon arises in deep Ordinal Regression (OR) tasks, by combining the cumulative link model for OR with the UFM. We show that a phenomenon we call Ordinal Neural Collapse (ONC) indeed emerges and is characterized by the following three properties: (ONC1) all optimal features in the same class collapse to their within-class mean when regularization is applied; (ONC2) these class means align with the classifier, meaning that they collapse onto a one-dimensional subspace; (ONC3) the optimal latent variables (corresponding to logits or preactivations in classification tasks) are aligned according to the class order, and in particular, in the zero-regularization limit, a highly local and simple geometric relationship emerges between the latent variables and the threshold values. We prove these properties analytically within the UFM framework with fixed threshold values and corroborate them empirically across a variety of datasets. We also discuss how these insights can be leveraged in OR, highlighting the use of fixed thresholds.
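As background, the cumulative link model referenced above is commonly written as follows; this is a standard formulation with illustrative notation, not a claim about the paper's exact setup:

```latex
% Cumulative link (proportional odds) model for K ordered classes,
% with latent variable v = w^T h(x) and fixed thresholds
% b_0 = -\infty < b_1 < \cdots < b_{K-1} < b_K = +\infty.
\[
  P(y \le k \mid x) = \sigma\!\left(b_k - v\right), \qquad
  P(y = k \mid x) = \sigma\!\left(b_k - v\right) - \sigma\!\left(b_{k-1} - v\right),
\]
% where \sigma is the logistic function.
```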
Problem

Research questions and friction points this paper is trying to address.

Does a Neural Collapse-like geometry emerge in deep ordinal regression (OR)?
Which properties characterize Ordinal Neural Collapse (ONC) within the UFM framework? (A minimal UFM sketch follows this list.)
What role do fixed thresholds play in OR?
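
Below is a minimal sketch of a UFM for ordinal regression with a cumulative link loss. In a UFM, the penultimate-layer features are optimized directly as free parameters instead of being produced by a backbone network. All names and hyperparameters here are illustrative, not the paper's.

```python
import torch

torch.manual_seed(0)
K, N, d = 5, 200, 16                        # classes, samples, feature dim
y = torch.randint(0, K, (N,))               # ordinal labels in {0, ..., K-1}
H = torch.randn(N, d, requires_grad=True)   # free features (the "U" in UFM)
w = torch.randn(d, requires_grad=True)      # classifier direction
b = torch.linspace(-2.0, 2.0, K - 1)        # fixed, strictly ordered thresholds

def cumulative_link_nll(H, w, b, y):
    """Negative log-likelihood of the cumulative link (logit) model."""
    v = H @ w                                            # latent variables, (N,)
    # Pad thresholds so P(y = k) = sigma(b_k - v) - sigma(b_{k-1} - v).
    b_full = torch.cat([torch.tensor([-1e9]), b, torch.tensor([1e9])])
    cdf = torch.sigmoid(b_full[None, :] - v[:, None])    # (N, K + 1)
    probs = cdf[:, 1:] - cdf[:, :-1]                     # (N, K)
    return -torch.log(probs[torch.arange(len(y)), y] + 1e-12).mean()

lam = 1e-3   # weight decay; ONC1/ONC2 hold under regularization, while ONC3's
             # local threshold geometry concerns the lam -> 0 limit
opt = torch.optim.SGD([H, w], lr=0.1)
for _ in range(2000):
    opt.zero_grad()
    loss = cumulative_link_nll(H, w, b, y) + lam * (H.pow(2).mean() + w.pow(2).sum())
    loss.backward()
    opt.step()
```

Note that the thresholds b are deliberately excluded from the optimizer, matching the fixed-threshold setting the paper analyzes.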
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines the cumulative link model for OR with the Unconstrained Feature Model (UFM)
Introduces Ordinal Neural Collapse (ONC) and proves its three defining properties (ONC1–ONC3)
Derives practical insights about fixed thresholds in OR (simple ONC diagnostics are sketched after this list)
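
The sketch below shows simple diagnostics for the three ONC properties, given features H (N, d), labels y (N,), classifier w (d,), and fixed thresholds b (K - 1,). The interleaving test for ONC3 is one plausible reading of the "local geometric relationship" described in the abstract; the paper's exact characterization may differ.

```python
import torch

def onc_diagnostics(H: torch.Tensor, y: torch.Tensor,
                    w: torch.Tensor, b: torch.Tensor):
    K = int(y.max()) + 1
    means = torch.stack([H[y == k].mean(dim=0) for k in range(K)])

    # ONC1: within-class scatter relative to between-class scatter (-> 0).
    within = sum(H[y == k].var(dim=0, unbiased=False).sum() for k in range(K)) / K
    between = means.var(dim=0, unbiased=False).sum()
    onc1 = (within / between).item()

    # ONC2: residual of the class means off span{w} (-> 0 under alignment).
    w_hat = w / w.norm()
    resid = means - (means @ w_hat)[:, None] * w_hat[None, :]
    onc2 = (resid.norm() / means.norm()).item()

    # ONC3: class-mean latents v_k are strictly increasing, and each
    # threshold b_k lies between consecutive latents v_k and v_{k+1}
    # (an assumed reading of the paper's local threshold geometry).
    v = means @ w
    ordered = bool(torch.all(v[:-1] < v[1:]))
    interleaved = bool(torch.all((b > v[:-1]) & (b < v[1:])))
    return onc1, onc2, ordered and interleaved
```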