The Representational Geometry of Number

📅 2026-02-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study asks whether numerical concept representations in language models lie on a single shared manifold or are distributed across orthogonal, task-specific subspaces, and shows how these two regimes coexist and transform into one another. Through high-dimensional geometric analysis of representations, linear subspace decomposition, and cross-task mapping, the authors find that stable geometric relationships among concepts form a shared structural backbone, while task-specific information is encoded along separable linear directions. Crucially, the subspaces associated with distinct tasks can be transformed into one another via simple linear mappings. Together, these findings demonstrate that conceptual representations in language models combine structural stability with task-dependent flexibility, and that the shared structure arises from the relational geometry among concepts rather than from the concepts themselves.
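
To make the cross-task mapping claim concrete, here is a minimal sketch of the kind of test it implies: fit a least-squares linear map from one task's number representations onto another's and score the fit. All names, shapes, and the synthetic activations below are illustrative stand-ins, not the paper's actual setup.

```python
import numpy as np

# Minimal sketch (not the paper's code): test whether task-A number
# representations map linearly onto task-B representations.
rng = np.random.default_rng(0)
n_numbers, d_model = 100, 32  # kept small so least squares is overdetermined

# Stand-in activations; in practice these would be a language model's
# hidden states for the same number tokens under two task prompts.
X_a = rng.normal(size=(n_numbers, d_model))  # task A representations
X_b = rng.normal(size=(n_numbers, d_model))  # task B representations

# Fit W minimizing ||X_a @ W - X_b||_F via ordinary least squares.
W, *_ = np.linalg.lstsq(X_a, X_b, rcond=None)

# R^2-style alignment: 1.0 means task A transforms perfectly into task B.
residual = np.linalg.norm(X_a @ W - X_b) ** 2
total = np.linalg.norm(X_b - X_b.mean(axis=0)) ** 2
print(f"cross-task linear alignment: {1.0 - residual / total:.3f}")
```

On real hidden states one would fit the map on a training split of numbers and score alignment on held-out ones, typically with ridge regularization, since model width usually exceeds the number of probed items.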

📝 Abstract
A central question in cognitive science is whether conceptual representations converge onto a shared manifold to support generalization, or diverge into orthogonal subspaces to minimize task interference. While prior work has found evidence for both, a mechanistic account of how these properties coexist and transform across tasks remains elusive. We propose that representational sharing lies not in the concepts themselves, but in the geometric relations between them. Using number concepts as a testbed and language models as high-dimensional computational substrates, we show that number representations preserve a stable relational structure across tasks. Task-specific representations are embedded in distinct subspaces, with low-level features such as magnitude and parity encoded along separable linear directions. Crucially, we find that these subspaces are largely transformable into one another via linear mappings, indicating that representations share relational structure despite occupying distinct subspaces. Together, these results provide a mechanistic lens on how language models balance the shared structure of number representations with functional flexibility, and suggest that understanding arises when task-specific transformations are applied to a shared underlying relational structure of conceptual representations.
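
The abstract's claim that magnitude and parity occupy separable linear directions corresponds, in practice, to fitting independent linear probes and checking that their weight vectors are close to orthogonal. The sketch below illustrates that probing recipe on synthetic activations; the data and dimensions are hypothetical, not the paper's implementation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Minimal probing sketch (synthetic data, hypothetical setup): one
# stand-in hidden-state vector per number from 0 to 99.
rng = np.random.default_rng(0)
numbers = np.arange(100)
X = rng.normal(size=(len(numbers), 32))

# Magnitude probe: a single linear direction predicting numeric value.
mag = LinearRegression().fit(X, numbers)
mag_dir = mag.coef_ / np.linalg.norm(mag.coef_)

# Parity probe: a single linear direction separating even from odd.
par = LogisticRegression(max_iter=1000).fit(X, numbers % 2)
par_dir = par.coef_[0] / np.linalg.norm(par.coef_[0])

# "Separable linear directions" predicts a near-zero cosine between probes.
print(f"cos(magnitude, parity) = {float(mag_dir @ par_dir):.3f}")
```

With genuine model activations, near-orthogonal probe directions that each decode their feature well would support the separability claim; strongly correlated directions would argue against it.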
Problem

Research questions and friction points this paper is trying to address.

representational geometry
conceptual representations
task interference
relational structure
cognitive science
Innovation

Methods, ideas, or system contributions that make the work stand out.

representational geometry
relational structure
linear mapping
task-specific subspaces
conceptual representations