What is a Number, That a Large Language Model May Know It?

📅 2025-02-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Large language models (LLMs) exhibit a fundamental tension in numeracy: they encode numbers as tokenized strings while simultaneously attempting to model their numerical semantics, leading to representational entanglement and downstream reasoning biases.

Method: We conduct a systematic investigation, including cognitively inspired similarity prompting, geometric analysis of embeddings, controlled contextual ablation, and transfer evaluation on real-world decision-making tasks, to characterize the structure of number representations in LLMs.

Contribution/Results: We find that LLM number embeddings form an intrinsically entangled space that simultaneously reflects Levenshtein string distance and log-scaled numerical distance. This entanglement is pervasive across models, resists full disentanglement via standard interventions, and demonstrably impairs numerical reasoning accuracy and decision consistency. Our findings provide evidence for a core symbol-semantics alignment bottleneck in LLMs and inform principled approaches to numeracy-aware representation learning and robustness.

📝 Abstract
Numbers are a basic part of how humans represent and describe the world around them. As a consequence, learning effective representations of numbers is critical for the success of large language models as they become more integrated into everyday decisions. However, these models face a challenge: depending on context, the same sequence of digit tokens, e.g., 911, can be treated as a number or as a string. What kind of representations arise from this duality, and what are its downstream implications? Using a similarity-based prompting technique from cognitive science, we show that LLMs learn representational spaces that blend string-like and numerical representations. In particular, we show that elicited similarity judgments from these models over integer pairs can be captured by a combination of Levenshtein edit distance and numerical Log-Linear distance, suggesting an entangled representation. In a series of experiments we show how this entanglement is reflected in the latent embeddings, how it can be reduced but not entirely eliminated by context, and how it can propagate into a realistic decision scenario. These results shed light on a representational tension in transformer models that must learn what a number is from text input.
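The core claim above, that elicited similarity judgments over integer pairs are captured by a blend of Levenshtein edit distance and log-linear numerical distance, can be illustrated with a minimal sketch. The blending weight `w_str` and the exact functional form are illustrative assumptions, not the paper's fitted parameters:

```python
import math

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def log_linear_distance(x: int, y: int) -> float:
    """Distance on a logarithmically compressed number line."""
    return abs(math.log(x) - math.log(y))

def entangled_distance(x: int, y: int, w_str: float = 0.5) -> float:
    """Weighted blend of string-like and numerical distance.

    w_str is a hypothetical free parameter; the paper fits such a
    mixture to model similarity judgments elicited from LLMs.
    """
    return (w_str * levenshtein(str(x), str(y))
            + (1 - w_str) * log_linear_distance(x, y))

# Example of the duality: 911 and 119 are close as strings
# (edit distance 2) but far apart on the log number line.
print(levenshtein("911", "119"))          # string view
print(log_linear_distance(911, 119))      # numerical view
```

Under such a blended metric, a pair like (911, 119) can look more similar than its purely numerical distance warrants, which is the kind of entanglement the experiments probe.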
Problem

Research questions and friction points this paper is trying to address.

Large Language Models
Numerical Understanding
Context-dependent Interpretation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Large Language Models
Numerical Understanding
Contextual Influence