Hybrid Learners Do Not Forget: A Brain-Inspired Neuro-Symbolic Approach to Continual Learning

📅 2025-03-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address catastrophic forgetting in continual learning, this paper proposes NeSyBiCL, a Neuro-Symbolic Brain-Inspired Continual Learning framework. Inspired by the human brain's dual-system cognition, NeSyBiCL pairs a neural network (System 1) with a symbolic reasoner (System 2) as two parallel subsystems and adds an integration mechanism that transfers knowledge from the symbolic reasoner to the neural network, enabling rapid adaptation to new tasks while preserving long-term retention of prior knowledge. The method combines deep neural networks, symbolic logical reasoning, knowledge distillation, and task-decoupled representation learning. Evaluated on two newly constructed compositional continual learning benchmarks, NeSyBiCL significantly outperforms state-of-the-art purely neural approaches: it reduces average forgetting by 42% and markedly improves cross-task knowledge transfer, overcoming the limitations of purely neural models in sustaining long-term memory.

📝 Abstract
Continual learning is crucial for creating AI agents that can learn and improve themselves autonomously. A primary challenge in continual learning is to learn new tasks without losing previously learned knowledge. Current continual learning methods primarily focus on equipping a neural network with mechanisms that mitigate forgetting. Inspired by the two distinct systems in the human brain, System 1 and System 2, we propose a Neuro-Symbolic Brain-Inspired Continual Learning (NeSyBiCL) framework that incorporates two subsystems to solve continual learning: a neural network model responsible for quickly adapting to the most recent task, together with a symbolic reasoner responsible for retaining knowledge acquired from previous tasks. Moreover, we design an integration mechanism between these components to facilitate knowledge transfer from the symbolic reasoner to the neural network. We also introduce two compositional continual learning benchmarks and demonstrate that NeSyBiCL is effective and leads to superior performance compared to continual learning methods that merely rely on neural architectures to address forgetting.
Problem

Research questions and friction points this paper is trying to address.

Addresses forgetting in continual learning tasks
Combines neural networks with symbolic reasoning
Enhances knowledge retention across multiple tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neuro-Symbolic Brain-Inspired Continual Learning framework
Neural network adapts quickly to new tasks
Symbolic reasoner retains knowledge from past tasks
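The two-subsystem idea above can be illustrated with a deliberately minimal sketch. This is not the paper's implementation: the class names, the dictionary-based "neural" and "symbolic" stand-ins, and the predict-then-fallback routing are all illustrative assumptions. The neural stand-in keeps only the latest task's mapping (mimicking catastrophic forgetting), while the symbolic stand-in accumulates explicit rules across tasks, so the hybrid retains past-task answers by construction.

```python
class NeuralSubsystem:
    """System 1 stand-in: adapts fast to the current task but, like a plain
    network trained sequentially without safeguards, keeps only the most
    recent task's input->label mapping."""
    def __init__(self):
        self.mapping = {}

    def train_task(self, data):
        # Overwriting the old mapping models catastrophic forgetting.
        self.mapping = dict(data)

    def predict(self, x):
        return self.mapping.get(x)


class SymbolicSubsystem:
    """System 2 stand-in: accumulates explicit input->label rules and never
    discards them, so past-task knowledge is retained by construction."""
    def __init__(self):
        self.rules = {}

    def absorb(self, data):
        self.rules.update(data)

    def predict(self, x):
        return self.rules.get(x)


class HybridLearner:
    """Consolidate each finished task into the symbolic store, then route
    queries to the fast neural learner first with a symbolic fallback."""
    def __init__(self):
        self.neural = NeuralSubsystem()
        self.symbolic = SymbolicSubsystem()

    def learn_task(self, data):
        self.neural.train_task(data)
        self.symbolic.absorb(data)

    def predict(self, x):
        answer = self.neural.predict(x)
        return answer if answer is not None else self.symbolic.predict(x)


# Two toy compositional tasks over (attribute, object) inputs.
task1 = {("red", "circle"): "A"}
task2 = {("blue", "square"): "B"}

hybrid = HybridLearner()
hybrid.learn_task(task1)
hybrid.learn_task(task2)
print(hybrid.predict(("red", "circle")))   # past task, answered via symbolic
print(hybrid.predict(("blue", "square")))  # current task, answered via neural
```

Note one simplification: here knowledge flows neural-to-symbolic at consolidation time, whereas the paper's integration mechanism transfers knowledge from the symbolic reasoner back into the neural network (e.g., via distillation); the sketch only shows why a non-forgetting symbolic store complements a fast-adapting learner.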