Implementing Tensor Logic: Unifying Datalog and Neural Reasoning via Tensor Contraction

📅 2026-01-23
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the challenge of integrating the reliability of symbolic reasoning with the learning capacity of neural networks to build interpretable and scalable AI reasoning systems. It proposes a unified tensor contraction framework that translates recursive Datalog rules into iterative tensor operations, enabling learnable symbolic reasoning in embedding spaces. The approach provides the first empirical validation of a tensor-based logical formalism, unifying logical rules, relational embeddings ($R_r = E^\top A_r E$), and learnable transformation matrices through Einstein summation. It supports zero-shot compositional reasoning and multi-hop knowledge graph inference, accurately deriving 33,945 ancestral relations on a biblical genealogy graph and achieving MRR scores of 0.3068 (standard) and 0.3346 (compositional) on FB15k-237, demonstrating strong generalization in the absence of direct training examples.
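The summary's core equivalence, a recursive Datalog rule computed as an iterated tensor contraction, can be sketched in a few lines of NumPy. This is a minimal illustration on a toy four-person chain, not the paper's code or its biblical genealogy data; the function name and graph are assumptions.

```python
import numpy as np

# Datalog rules for ancestry (a sketch of the paper's idea):
#   ancestor(x, z) :- parent(x, z)
#   ancestor(x, z) :- parent(x, y), ancestor(y, z)
# The recursive rule becomes a Boolean matrix product, iterated to a fixpoint.

def transitive_closure(parent: np.ndarray) -> np.ndarray:
    """Iterate ancestor <- ancestor OR (parent @ ancestor) until no new facts appear."""
    ancestor = parent.astype(bool).copy()
    while True:
        # einsum contracts over the shared variable y, mirroring the rule body
        step = np.einsum('xy,yz->xz', parent.astype(int), ancestor.astype(int)) > 0
        new = ancestor | step
        if np.array_equal(new, ancestor):  # fixpoint reached: no new ancestor facts
            return new
        ancestor = new

# Toy chain of 4 individuals: 0 -> 1 -> 2 -> 3
P = np.zeros((4, 4), dtype=bool)
P[0, 1] = P[1, 2] = P[2, 3] = True
A = transitive_closure(P)
print(int(A.sum()))  # 6 ancestor pairs for a chain of length 4
```

On the paper's genealogy graph the same fixpoint loop is reported to converge in 74 iterations; the stopping criterion (no new derived facts) is identical.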

๐Ÿ“ Abstract
The unification of symbolic reasoning and neural networks remains a central challenge in artificial intelligence. Symbolic systems offer reliability and interpretability but lack scalability, while neural networks provide learning capabilities but sacrifice transparency. Tensor Logic, proposed by Domingos, suggests that logical rules and Einstein summation are mathematically equivalent, offering a principled path toward unification. This paper provides empirical validation of this framework through three experiments. First, we demonstrate the equivalence between recursive Datalog rules and iterative tensor contractions by computing the transitive closure of a biblical genealogy graph containing 1,972 individuals and 1,727 parent-child relationships, converging in 74 iterations to discover 33,945 ancestor relationships. Second, we implement reasoning in embedding space by training a neural network with learnable transformation matrices, demonstrating successful zero-shot compositional inference on held-out queries. Third, we validate the Tensor Logic superposition construction on FB15k-237, a large-scale knowledge graph with 14,541 entities and 237 relations. Using Domingos's relation matrix formulation $R_r = E^\top A_r E$, we achieve MRR of 0.3068 on standard link prediction and MRR of 0.3346 on a compositional reasoning benchmark where direct edges are removed during training, demonstrating that matrix composition enables multi-hop inference without direct training examples.
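The abstract's superposition construction, $R_r = E^\top A_r E$, and the claim that matrix composition enables multi-hop inference can be sketched directly. The toy entities, facts, and the triple-scoring rule $e_h^\top R\, e_t$ below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n_entities, dim = 6, 4

# Entity embedding matrix E (n_entities x dim) and one Boolean adjacency
# matrix A_r per relation, following the abstract's R_r = E^T A_r E.
E = rng.normal(size=(n_entities, dim))
A_parent = np.zeros((n_entities, n_entities))
A_parent[0, 1] = A_parent[1, 2] = 1.0  # toy facts: parent(0,1), parent(1,2)

# Relation matrix in embedding space (dim x dim)
R_parent = E.T @ A_parent @ E

# Multi-hop inference by composition: grandparent is parent composed with parent,
# so no direct grandparent training edges are needed.
R_grand = R_parent @ R_parent

# Score a candidate triple grandparent(0, 2) as e_head^T R e_tail
score = E[0] @ R_grand @ E[2]
```

Because $R_r$ lives in the $d \times d$ embedding space rather than the $n \times n$ entity space, composed relations stay cheap to form even on graphs the size of FB15k-237.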
Problem

Research questions and friction points this paper is trying to address.

symbolic reasoning
neural networks
unification
tensor logic
compositional inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Tensor Logic
tensor contraction
neural-symbolic unification
compositional reasoning
knowledge graph embedding