Enhancing Symbolic Machine Learning by Subsymbolic Representations

📅 2025-06-17
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the inefficiency and limited semantic modeling capability of neuro-symbolic AI in discriminative learning tasks involving numerous constants, this paper proposes a lightweight neural-symbolic integration method. Specifically, it injects differentiable sub-symbolic constant embeddings directly into the similarity predicates of the symbolic learning framework TILDE, enabling joint optimization of embeddings and logical rules under formal logical constraints. This approach achieves, for the first time, end-to-end co-tuning of embeddings and symbolic learners without requiring complex end-to-end architectures, thereby preserving both interpretability and computational efficiency. Evaluated on three real-world tasks, the method significantly outperforms mainstream baselines—including Logic Tensor Networks (LTN) and DeepProbLog—in F1 score. Results demonstrate its simplicity, effectiveness, and strong generalization capability across diverse domains.

Technology Category

Application Category

📝 Abstract
The goal of neuro-symbolic AI is to integrate symbolic and subsymbolic AI approaches, to overcome the limitations of either. Prominent systems include Logic Tensor Networks (LTN) or DeepProbLog, which offer neural predicates and end-to-end learning. The versatility of systems like LTNs and DeepProbLog, however, makes them less efficient in simpler settings, for instance, for discriminative machine learning, in particular in domains with many constants. Therefore, we follow a different approach: We propose to enhance symbolic machine learning schemes by giving them access to neural embeddings. In the present paper, we show this for TILDE and embeddings of constants used by TILDE in similarity predicates. The approach can be fine-tuned by further refining the embeddings depending on the symbolic theory. In experiments in three real-world domain, we show that this simple, yet effective, approach outperforms all other baseline methods in terms of the F1 score. The approach could be useful beyond this setting: Enhancing symbolic learners in this way could be extended to similarities between instances (effectively working like kernels within a logical language), for analogical reasoning, or for propositionalization.
Problem

Research questions and friction points this paper is trying to address.

Integrate symbolic and subsymbolic AI to overcome limitations
Improve efficiency of symbolic learning with neural embeddings
Enhance symbolic learners for analogical reasoning and propositionalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrate symbolic and subsymbolic AI approaches
Enhance symbolic learning with neural embeddings
Fine-tune embeddings based on symbolic theory
🔎 Similar Papers
No similar papers found.
S
Stephen Roth
Johannes Gutenberg Universität Mainz
L
Lennart Baur
Johannes Gutenberg Universität Mainz
D
Derian Boer
Johannes Gutenberg Universität Mainz
Stefan Kramer
Stefan Kramer
Professor of Computer Science, Johannes Gutenberg University Mainz
Data MiningMachine LearningCheminformaticsQSARComputational Sustainability