Symbol-Equivariant Recurrent Reasoning Models

📅 2026-03-02
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work proposes the Symbol-Equivariant Recurrent Reasoning Model (SE-RRM), a novel architecture that explicitly embeds symbol permutation equivariance into a recurrent framework to address the challenge of modeling symbolic symmetries in structured reasoning tasks such as Sudoku and ARC-AGI. Unlike prior approaches that rely heavily on data augmentation, SE-RRM incorporates symbol-equivariant layers that guarantee output consistency under symbol or color permutations without any augmentation. With only approximately 2 million parameters, the model achieves strong generalization: it outperforms existing recurrent reasoning models on standard 9×9 Sudoku and successfully generalizes across scales to 4×4, 16×16, and 25×25 grids. Furthermore, it attains competitive performance on ARC-AGI tasks, demonstrating the effectiveness of explicit symmetry modeling in improving data efficiency and generalization.
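To make the core property concrete: a layer is symbol-permutation equivariant if relabeling the symbols of its input relabels its output in exactly the same way. The sketch below is not the paper's actual SE-RRM layer; it is a minimal Deep Sets-style illustration (per-symbol scaling plus a symbol-wise mean, with hypothetical coefficients `a` and `b`) that satisfies the equivariance property the summary describes.

```python
import numpy as np

def symbol_equivariant_layer(x, a=1.5, b=-0.5):
    """Toy layer that commutes with any permutation of the symbol axis.

    x: array of shape (cells, symbols), e.g. logits per board cell.
    f(x) = a * x + b * mean_over_symbols(x)  (mean is permutation-invariant,
    the elementwise term permutes along with the input), so
    f(x[:, perm]) == f(x)[:, perm] for every permutation `perm`.
    """
    return a * x + b * x.mean(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.standard_normal((81, 9))   # 9x9 Sudoku board, 9 symbol channels
perm = rng.permutation(9)          # an arbitrary relabeling of the 9 symbols

out_then_perm = symbol_equivariant_layer(x)[:, perm]
perm_then_out = symbol_equivariant_layer(x[:, perm])
assert np.allclose(out_then_perm, perm_then_out)  # equivariance holds
```

Because equivariance is enforced by the layer's structure rather than learned, no permutation-augmented training data is needed to obtain it, which is the data-efficiency argument the summary makes.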

๐Ÿ“ Abstract
Reasoning problems such as Sudoku and ARC-AGI remain challenging for neural networks. The Recurrent Reasoning Model (RRM) family of structured problem-solving architectures, including the Hierarchical Reasoning Model (HRM) and the Tiny Recursive Model (TRM), offers a compact alternative to large language models, but currently handles symbol symmetries only implicitly via costly data augmentation. We introduce Symbol-Equivariant Recurrent Reasoning Models (SE-RRMs), which enforce permutation equivariance at the architectural level through symbol-equivariant layers, guaranteeing identical solutions under symbol or color permutations. SE-RRMs outperform prior RRMs on 9x9 Sudoku and, after training only on 9x9, generalize to smaller 4x4 and larger 16x16 and 25x25 instances, to which existing RRMs cannot extrapolate. On ARC-AGI-1 and ARC-AGI-2, SE-RRMs achieve competitive performance with substantially less data augmentation and only 2 million parameters, demonstrating that explicitly encoding symmetry improves the robustness and scalability of neural reasoning. Code is available at https://github.com/ml-jku/SE-RRM.
Problem

Research questions and friction points this paper is trying to address.

symbol symmetry
neural reasoning
permutation equivariance
Sudoku
ARC-AGI
Innovation

Methods, ideas, or system contributions that make the work stand out.

symbol-equivariance
recurrent reasoning models
permutation equivariance
neural reasoning
data-efficient generalization