A Learnability Analysis on Neuro-Symbolic Learning

📅 2025-03-21
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the problem of determining learnability for neuro-symbolic (NeSy) tasks in hybrid systems. It models NeSy tasks as derived constraint satisfaction problems (DCSPs) and establishes a formal learnability criterion: a task is learnable if and only if its associated DCSP admits a unique solution. Building on the clustering structure of the hypothesis space, it derives an upper bound on the generalization error for learnable tasks and quantifies how the asymptotic error depends on the degree of disagreement among solutions. Integrating formal learnability theory, DCSP modeling, and statistical learning analysis, the study delivers three theoretical contributions: (i) a decidable learnability criterion; (ii) a finite-sample generalization error bound; and (iii) an asymptotic scaling law governing error convergence. Collectively, these results provide a unified theoretical foundation for the design and analysis of NeSy algorithms.
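The criterion above can be illustrated with a toy discrete sketch (a simplification: the paper's DCSPs are differentiable, and the variable names and constraints below are hypothetical, not taken from the paper): enumerate all assignments satisfying a small derived constraint set, and declare the task learnable exactly when one solution survives.

```python
from itertools import product

def solutions(domain, variables, constraints):
    """Enumerate all assignments over `domain` satisfying every constraint."""
    sols = []
    for values in product(domain, repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(c(assignment) for c in constraints):
            sols.append(assignment)
    return sols

def is_learnable(domain, variables, constraints):
    """Paper's criterion (toy form): learnable iff the derived CSP
    has a unique solution."""
    return len(solutions(domain, variables, constraints)) == 1

# Two binary concept variables x, y constrained by symbolic knowledge.
# With both constraints, only x=0, y=1 survives: learnable.
unique = is_learnable(
    domain=[0, 1],
    variables=["x", "y"],
    constraints=[lambda a: a["x"] + a["y"] == 1, lambda a: a["x"] == 0],
)
# Dropping the second constraint leaves two solutions, (0,1) and (1,0),
# so the concept semantics are underdetermined: unlearnable.
ambiguous = is_learnable(
    domain=[0, 1],
    variables=["x", "y"],
    constraints=[lambda a: a["x"] + a["y"] == 1],
)
```

In the unlearnable case, the surviving solutions disagree on every variable; the paper's asymptotic analysis quantifies how such disagreement governs the expected error.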

πŸ“ Abstract
This paper analyzes the learnability of neuro-symbolic (NeSy) tasks within hybrid systems. We show that the learnability of NeSy tasks can be characterized by their derived constraint satisfaction problems (DCSPs). Specifically, a task is learnable if the corresponding DCSP has a unique solution; otherwise, it is unlearnable. For learnable tasks, we establish error bounds by exploiting the clustering property of the hypothesis space. Additionally, we analyze the asymptotic error for general NeSy tasks, showing that the expected error scales with the disagreement among solutions. Our results offer a principled approach to determining learnability and provide insights into the design of new algorithms.
Problem

Research questions and friction points this paper is trying to address.

Analyzing learnability of neuro-symbolic tasks in hybrid systems
Characterizing learnability via derived constraint satisfaction problems
Establishing error bounds using hypothesis space clustering
Innovation

Methods, ideas, or system contributions that make the work stand out.

Characterizes learnability via derived constraint satisfaction problems
Establishes error bounds using hypothesis space clustering
Analyzes asymptotic error scaling with solution disagreements
Hao-Yuan He
Nanjing University
Machine Learning · Data Mining
Ming Li
National Key Laboratory for Novel Software Technology, Nanjing University, China; School of Artificial Intelligence, Nanjing University, China