Learning neuro-symbolic convergent term rewriting systems

📅 2025-07-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
Neuro-symbolic systems struggle with algorithm-level strong generalization and out-of-distribution reasoning. Method: We propose a neuro-symbolic architecture inspired by term rewriting theory, introducing modular, end-to-end trainable models—NRS and FastNRS—that explicitly model symbolic algorithm execution as a convergent rewriting process, enabling unified learning and inference for complex symbolic tasks such as mathematical expression simplification. Our approach combines neural computational efficiency with symbolic structural priors. Contribution/Results: The models achieve substantial improvements in memory efficiency, training speed, and inference latency. On four mathematical simplification benchmarks, they consistently outperform the Neural Data Router and GPT-4o; under multi-task joint learning, their performance matches or exceeds that of OpenAI o1-preview. These results demonstrate robust algorithm-level generalization and cross-task transfer capability.

📝 Abstract
Building neural systems that can learn to execute symbolic algorithms is a challenging open problem in artificial intelligence, especially when aiming for strong generalization and out-of-distribution performance. In this work, we introduce a general framework for learning convergent term rewriting systems using a neuro-symbolic architecture inspired by the rewriting algorithm itself. We present two modular implementations of this architecture: the Neural Rewriting System (NRS) and the Fast Neural Rewriting System (FastNRS). As a result of the algorithm-inspired design and key architectural elements, both models can generalize to out-of-distribution instances, with FastNRS offering significant improvements in terms of memory efficiency, training speed, and inference time. We evaluate both architectures on four tasks involving the simplification of mathematical formulas and further demonstrate their versatility in a multi-domain learning scenario, where a single model is trained to solve multiple types of problems simultaneously. The proposed system significantly outperforms two strong neural baselines: the Neural Data Router, a recent transformer variant specifically designed to solve algorithmic problems, and GPT-4o, one of the most powerful general-purpose large language models. Moreover, our system matches or outperforms the latest o1-preview model from OpenAI, which excels on reasoning benchmarks.
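The convergent rewriting process the paper builds on can be illustrated with a tiny symbolic sketch: rules are applied repeatedly until the term stops changing (a fixed point, which convergence guarantees exists). This is my own minimal illustration of classical term rewriting, not the paper's neural NRS/FastNRS architecture; all names and the rule set are invented for the example.

```python
# Minimal sketch of a convergent term rewriting system for arithmetic
# simplification (illustration only; not the paper's NRS/FastNRS models).
# Terms are nested tuples like ('+', x, y), plain ints, or variable names.

RULES = [
    (('+', '?x', 0), '?x'),   # x + 0 -> x
    (('+', 0, '?x'), '?x'),   # 0 + x -> x
    (('*', '?x', 1), '?x'),   # x * 1 -> x
    (('*', 1, '?x'), '?x'),   # 1 * x -> x
    (('*', '?x', 0), 0),      # x * 0 -> 0
    (('*', 0, '?x'), 0),      # 0 * x -> 0
]

def match(pattern, term, env):
    """Match `pattern` against `term`, binding ?-variables into env."""
    if isinstance(pattern, str) and pattern.startswith('?'):
        if pattern in env:
            return env[pattern] == term
        env[pattern] = term
        return True
    if isinstance(pattern, tuple) and isinstance(term, tuple):
        return (len(pattern) == len(term)
                and all(match(p, t, env) for p, t in zip(pattern, term)))
    return pattern == term

def substitute(template, env):
    """Instantiate a rule's right-hand side with the bindings found."""
    if isinstance(template, str) and template.startswith('?'):
        return env[template]
    if isinstance(template, tuple):
        return tuple(substitute(t, env) for t in template)
    return template

def rewrite_once(term):
    """Apply the first matching rule at the root, else rewrite subterms."""
    for lhs, rhs in RULES:
        env = {}
        if match(lhs, term, env):
            return substitute(rhs, env)
    if isinstance(term, tuple):
        return (term[0],) + tuple(rewrite_once(t) for t in term[1:])
    return term

def normalize(term):
    """Rewrite until a fixed point: convergence guarantees termination."""
    while True:
        new = rewrite_once(term)
        if new == term:
            return term
        term = new
```

For example, `normalize(('+', ('*', 'a', 1), 0))` reduces `(a * 1) + 0` to `'a'` in two rewriting steps. The paper's contribution, as described in the abstract, is to learn such a rewriting process end to end with neural components rather than hand-coding the rules.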
Problem

Research questions and friction points this paper is trying to address.

Learning symbolic algorithms with neural systems
Generalizing to out-of-distribution instances efficiently
Simplifying mathematical formulas across multiple domains
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neuro-symbolic architecture for term rewriting
Modular NRS and FastNRS implementations
Outperforms Neural Data Router and GPT-4o