A Smooth Transition Between Induction and Deduction: Fast Abductive Learning Based on Probabilistic Symbol Perception

📅 2025-02-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the low conversion efficiency and high computational overhead of transitioning from inductive learning to symbolic deductive reasoning, this paper proposes the Probabilistic Symbol Perception (PSP) optimization algorithm. PSP establishes a unified probabilistic modeling framework that bridges continuous numerical prediction and discrete logical inference. It introduces an experience-accumulating symbolic-relation modeling mechanism to eliminate redundant knowledge-base updates, and constructs a joint symbolic-numerical optimization framework incorporating Boolean-sequence mapping and efficient data structures. Empirical results demonstrate that PSP significantly reduces inference complexity while preserving logical correctness: it achieves substantial speedups in measured inference latency and maintains robust generalization across diverse tasks.

📝 Abstract
Abductive learning (ABL), which integrates the strengths of machine learning and logical reasoning to improve learning generalization, has recently been shown to be effective. However, its efficiency is affected by the transition between numerical induction and symbolic deduction, leading to high computational costs in the worst case. Efforts on this issue remain limited. In this paper, we identify three reasons why previous optimization algorithms for ABL were ineffective: insufficient utilization of predictions, of symbol relationships, and of accumulated experience from successful abductive processes, resulting in redundant calls to the knowledge base. To address these challenges, we introduce an optimization algorithm named Probabilistic Symbol Perception (PSP), which makes a smooth transition between induction and deduction while keeping the correctness of ABL unchanged. We leverage probability as a bridge and present an efficient data structure, achieving the transfer from a continuous probability sequence to discrete Boolean sequences with low computational complexity. Experiments demonstrate promising results.
Problem

Research questions and friction points this paper is trying to address.

Improve efficiency of abductive learning transitions
Reduce computational costs in worst-case scenarios
Optimize prediction and symbol relationship utilization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Probabilistic Symbol Perception algorithm
Efficient data structure usage
Low computational complexity transition
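The abstract describes using probability as a bridge from a continuous probability sequence to discrete Boolean sequences, enumerated via an efficient data structure. The paper's actual algorithm is not reproduced here; the sketch below is only one plausible illustration of that idea, assuming per-position symbol distributions and a heap-based best-first enumeration of candidate symbol sequences (the function names and the heap choice are this sketch's assumptions, not the paper's):

```python
import heapq
from itertools import count

def candidate_symbol_sequences(probs, k=3):
    """Yield up to k discrete symbol sequences in descending joint
    probability, given independent per-position distributions.

    probs: list of lists, probs[i][s] = P(symbol s at position i).
    A max-heap over partial assignments expands only promising
    candidates, so full enumeration of the symbol space is avoided.
    """
    n = len(probs)
    # Rank symbols at each position by descending probability.
    ranked = [sorted(range(len(p)), key=lambda s: -p[s]) for p in probs]

    def joint(idx):
        # Joint probability of choosing the idx[i]-th ranked symbol at i.
        prod = 1.0
        for i, r in enumerate(idx):
            prod *= probs[i][ranked[i][r]]
        return prod

    best = tuple(0 for _ in range(n))          # all top-ranked symbols
    tie = count()                              # tiebreaker for the heap
    heap = [(-joint(best), next(tie), best)]
    seen = {best}
    out = []
    while heap and len(out) < k:
        neg_p, _, idx = heapq.heappop(heap)
        out.append(([ranked[i][r] for i, r in enumerate(idx)], -neg_p))
        # Neighbours: demote exactly one position to its next-best symbol.
        for i in range(n):
            if idx[i] + 1 < len(ranked[i]):
                nxt = idx[:i] + (idx[i] + 1,) + idx[i + 1:]
                if nxt not in seen:
                    seen.add(nxt)
                    heapq.heappush(heap, (-joint(nxt), next(tie), nxt))
    return out

def to_boolean(seq, num_symbols):
    """One-hot encode a symbol sequence as Boolean vectors."""
    return [[s == j for j in range(num_symbols)] for s in seq]
```

Each candidate symbol sequence can then be one-hot encoded (`to_boolean`) and checked against the knowledge base in order of decreasing probability, so a consistent abduction is typically found without exhaustive search.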
Lin-Han Jia
LAMDA Group, Nanjing University
Machine Learning
Si-Yu Han
National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China
Lan-Zhe Guo
LAMDA Group, Nanjing University
Machine Learning
Zhi Zhou
National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China
Zhao-Long Li
National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China
Yu-Feng Li
Professor, Nanjing University
Machine Learning
Zhi-Hua Zhou
Nanjing University
Artificial Intelligence, Machine Learning, Data Mining