🤖 AI Summary
This work proposes the Symbol-Equivariant Recurrent Reasoning Model (SE-RRM), a novel architecture that embeds symbol-permutation equivariance directly into a recurrent framework, addressing the challenge of modeling symbolic symmetries in structured reasoning tasks such as Sudoku and ARC-AGI. Unlike prior approaches that rely heavily on data augmentation, SE-RRM incorporates symbol-equivariant layers that guarantee consistent outputs under symbol or color permutations without any augmentation. With only about 2 million parameters, the model achieves strong generalization: it outperforms existing recurrent reasoning models on standard 9×9 Sudoku and generalizes across scales to 4×4, 16×16, and 25×25 grids. It also attains competitive performance on ARC-AGI tasks, demonstrating that explicitly modeling symmetry improves data efficiency and generalization.
📝 Abstract
Reasoning problems such as Sudoku and ARC-AGI remain challenging for neural networks. The family of Recurrent Reasoning Models (RRMs) for structured problem solving, including the Hierarchical Reasoning Model (HRM) and the Tiny Recursive Model (TRM), offers a compact alternative to large language models, but currently handles symbol symmetries only implicitly, via costly data augmentation. We introduce Symbol-Equivariant Recurrent Reasoning Models (SE-RRMs), which enforce permutation equivariance at the architectural level through symbol-equivariant layers, guaranteeing identical solutions under symbol or color permutations. SE-RRMs outperform prior RRMs on 9x9 Sudoku and, trained only on 9x9 instances, generalize to smaller 4x4 and larger 16x16 and 25x25 instances, to which existing RRMs cannot extrapolate. On ARC-AGI-1 and ARC-AGI-2, SE-RRMs achieve competitive performance with substantially less data augmentation and only 2 million parameters, demonstrating that explicitly encoding symmetry improves the robustness and scalability of neural reasoning. Code is available at https://github.com/ml-jku/SE-RRM.
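The symbol-permutation equivariance the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation; it is a hypothetical DeepSets-style linear layer acting on the symbol (channel) axis, which is one standard way to obtain exact permutation equivariance: the output for each symbol combines that symbol's own logit with a symmetric sum over all symbols, so relabeling the symbols relabels the outputs identically.

```python
import numpy as np

def symbol_equivariant_linear(x, a=1.5, b=-0.3):
    """Permutation-equivariant map over the last (symbol) axis.

    y_k = a * x_k + b * sum_j x_j: the per-symbol term commutes with any
    permutation of the K channels, and the sum term is permutation-invariant,
    so permuting the input's symbol channels permutes the output the same way.
    (a, b are illustrative weights, not values from the paper.)
    """
    return a * x + b * x.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.normal(size=(9, 9, 9))   # e.g. a 9x9 Sudoku grid with 9 symbol logits per cell
perm = rng.permutation(9)        # an arbitrary relabeling of the 9 symbols

out_then_perm = symbol_equivariant_linear(x)[..., perm]
perm_then_out = symbol_equivariant_linear(x[..., perm])
print(np.allclose(out_then_perm, perm_then_out))  # equivariance holds exactly
```

Because the guarantee is architectural, it holds for any grid size and any permutation, which is the property the abstract credits for augmentation-free symbol symmetry.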