A tensor network formalism for neuro-symbolic AI

📅 2026-01-21
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses a long-standing challenge in neuro-symbolic artificial intelligence: logical reasoning and probabilistic learning have resisted integration for lack of a unified mathematical framework. The authors propose a formalism based on tensor networks in which logical formulas and probability distributions are jointly represented as structured tensor decompositions. Tensor contraction then serves as a universal inference mechanism, enabling the construction of the Hybrid Logic Network, a trainable hybrid logic-probabilistic model. Through tensor decomposition, basis encoding, contraction-based message-passing algorithms, and an accompanying Python library (tnreason), the framework achieves a unified and efficient computational treatment of both logical and probabilistic reasoning. This establishes a scalable, differentiable paradigm for neuro-symbolic systems, bridging symbolic expressiveness with data-driven learning.
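The core idea of "tensor contraction as a universal inference mechanism" can be illustrated with a minimal NumPy sketch. The formula, variables, and distributions below are hypothetical examples chosen for this illustration; this is not the tnreason API, only the underlying principle: a propositional formula is basis-encoded as a 0/1 truth-table tensor, and contracting it against probability vectors computes the probability that the formula holds.

```python
import numpy as np

# Basis-encode a propositional formula as a Boolean tensor:
# formula[x, y] = 1 iff the assignment (X=x, Y=y) satisfies it.
# Here f(X, Y) = X OR Y, an illustrative choice, not from the paper.
formula = np.array([[0, 1],
                    [1, 1]])  # indices: [X, Y]

# Independent marginals for X and Y (assumed for this sketch).
p_x = np.array([0.3, 0.7])  # P(X=0), P(X=1)
p_y = np.array([0.6, 0.4])  # P(Y=0), P(Y=1)

# Inference by tensor contraction: summing the formula tensor
# against the distribution vectors yields P(f(X, Y) = True).
prob_true = np.einsum("xy,x,y->", formula, p_x, p_y)
print(prob_true)  # 1 - P(X=0)*P(Y=0) = 1 - 0.18 = 0.82
```

The same contraction pattern scales to many variables, which is where the structured decompositions and message-passing schemes described in the paper become essential.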

Technology Category

Application Category

πŸ“ Abstract
The unification of neural and symbolic approaches to artificial intelligence remains a central open challenge. In this work, we introduce a tensor network formalism, which captures sparsity principles originating in the different approaches in tensor decompositions. In particular, we describe a basis encoding scheme for functions and model neural decompositions as tensor decompositions. The proposed formalism can be applied to represent logical formulas and probability distributions as structured tensor decompositions. This unified treatment identifies tensor network contractions as a fundamental inference class and formulates efficiently scaling reasoning algorithms, originating from probability theory and propositional logic, as contraction message passing schemes. The framework enables the definition and training of hybrid logical and probabilistic models, which we call Hybrid Logic Networks. The theoretical concepts are accompanied by the Python library tnreason, which enables the implementation and practical use of the proposed architectures.
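The "sparsity principles" captured by the decompositions can be seen in a small example: the truth tensor of a conjunction admits a rank-1 factorization as an outer product of one-hot basis vectors. This is a hedged illustration of basis encoding in plain NumPy, not code from tnreason.

```python
import numpy as np

# One-hot basis vector selecting "variable is True".
e1 = np.array([0, 1])

# The basis encoding of X AND Y is rank-1: the outer product
# e1 (x) e1 places a single 1 at the assignment (X=1, Y=1).
and_tensor = np.outer(e1, e1)    # shape (2, 2)

# Compare against the direct truth-table encoding of AND.
full = np.array([[0, 0],
                 [0, 1]])
print(np.array_equal(and_tensor, full))  # True
```

Such low-rank structure is what lets formula tensors over many variables be stored and contracted efficiently instead of materializing exponentially large truth tables.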
Problem

Research questions and friction points this paper is trying to address.

neuro-symbolic AI
tensor networks
neural-symbolic integration
logical reasoning
probabilistic inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

tensor network
neuro-symbolic AI
tensor decomposition
logical reasoning
probabilistic inference
Alex Goessmann
Weierstrass Institute for Applied Analysis and Stochastics
Janina Schutte
Weierstrass Institute for Applied Analysis and Stochastics
Maximilian Frohlich
Weierstrass Institute for Applied Analysis and Stochastics
Martin Eigel
Weierstrass Institute for Applied Analysis and Stochastics Berlin
uncertainty quantification
tensor methods
scientific machine learning
quantum computing