FoNE: Precise Single-Token Number Embeddings via Fourier Features

📅 2025-02-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Large language models (LLMs) represent numbers using multi-token subword or digit-wise encodings, leading to weak numerical reasoning and suboptimal training and inference efficiency. To address this, we propose Fourier Number Embedding (FoNE), the first method to represent arbitrary numbers exactly in a single token, encoding each decimal digit with only two dimensions of sine-cosine Fourier features. Inspired by Fourier-like frequency patterns observed in pretrained LLMs, FoNE supports both learnable and fixed frequencies and integrates into standard Transformer architectures without architectural modification. On 6-digit decimal addition, FoNE reduces the data required to reach 99% accuracy by 64x compared to baselines, while using 3x fewer tokens per number than subword tokenization and 6x fewer than digit-wise encoding. Moreover, FoNE is the only method to achieve 100% accuracy on over 100,000 test examples spanning addition, subtraction, and multiplication.

📝 Abstract
Large Language Models (LLMs) typically represent numbers using multiple tokens, which requires the model to aggregate these tokens to interpret numerical values. This fragmentation makes both training and inference less efficient and adversely affects the model's performance on number-related tasks. Inspired by the observation that pre-trained LLMs internally learn Fourier-like features for number tokens, we propose Fourier Number Embedding (FoNE), a novel method that directly maps numbers into the embedding space with their Fourier features. FoNE encodes each number as a single token with only two embedding dimensions per digit, effectively capturing numerical values without fragmentation. This compact representation accelerates both training and inference. Compared to traditional subword and digit-wise embeddings, FoNE not only reduces computational overhead but also achieves higher accuracy across various numerical tasks including addition, subtraction, and multiplication. On 6-digit decimal addition, FoNE requires 64$\times$ less data to achieve 99% accuracy than subword and digit-wise embeddings while using 3$\times$ and 6$\times$ fewer tokens per number, respectively. Furthermore, FoNE is the only method that yields 100% accuracy on over 100,000 test examples for addition, subtraction, and multiplication. The codes and visualization are available at https://fouriernumber.github.io/.
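To make the "two dimensions per digit" idea concrete, here is a minimal sketch of a Fourier digit encoding in the spirit the abstract describes: the (cos, sin) pair at period 10^(i+1) places the number on a circle whose phase determines its value modulo 10^(i+1), so each decimal digit is recoverable exactly. This is an illustration based only on the abstract, not the paper's actual implementation; the function names `fone_encode` and `fone_decode` are ours.

```python
import math

def fone_encode(x: int, num_digits: int) -> list[float]:
    """Embed an integer as 2*num_digits Fourier features (one token)."""
    emb = []
    for i in range(num_digits):
        period = 10 ** (i + 1)
        # Phase is proportional to x mod period, so this pair
        # encodes the low i+1 decimal digits of x on a circle.
        phase = 2 * math.pi * x / period
        emb.extend([math.cos(phase), math.sin(phase)])
    return emb

def fone_decode(emb: list[float], num_digits: int) -> int:
    """Recover the integer by reading each digit off its phase."""
    x = 0
    for i in range(num_digits):
        c, s = emb[2 * i], emb[2 * i + 1]
        phase = math.atan2(s, c) % (2 * math.pi)
        period = 10 ** (i + 1)
        residue = round(phase * period / (2 * math.pi)) % period  # x mod period
        x += (residue // 10 ** i) * 10 ** i  # isolate digit i
    return x
```

For example, `fone_encode(42, 3)` produces a 6-dimensional vector, and `fone_decode` reconstructs 42 exactly from it; because each digit occupies its own frequency, no digit information is lost to subword fragmentation.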
Problem

Research questions and friction points this paper is trying to address.

Improves numerical tasks in LLMs
Reduces token fragmentation for numbers
Enhances training and inference efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Single-token Fourier number embedding
Compact two-dimensional encoding per digit
Enhanced accuracy in numerical tasks