Inclusive Training Separation and Implicit Knowledge Interaction for Balanced Online Class-Incremental Learning

📅 2025-04-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the plasticity–stability dilemma in online class-incremental learning (OCIL) caused by the knowledge imbalance between old and new classes, this paper proposes a replay-based dual-classifier framework, Balanced Online Incremental Learning (BOIL). Methodologically, it introduces (1) an inclusive training separation strategy that replaces conventional exclusive class isolation, and (2) implicit knowledge distillation coupled with progressive feature alignment between the two classifiers, enabling dynamic integration of old and new knowledge. Joint optimization simultaneously enhances adaptability to incoming classes and retention of previously learned ones. Evaluated on three standard OCIL benchmarks, the approach significantly outperforms state-of-the-art methods, achieving more balanced accuracy across old and new classes and improving overall performance by 3.2–5.7 percentage points.
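The summary names implicit knowledge distillation between the two classifiers but does not specify its form. A common realization of such a coupling is a temperature-softened KL divergence from the old classifier's output distribution to the new classifier's. The sketch below is an illustrative assumption, not the paper's actual formulation; the temperature value and function names are hypothetical:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(old_logits, new_logits, temperature=2.0):
    """KL(p_old || p_new) on temperature-softened distributions.

    Hypothetical stand-in for the paper's implicit knowledge
    distillation: penalizes the new classifier for drifting away
    from the old classifier's predictive distribution.
    """
    p_old = softmax(old_logits, temperature)
    p_new = softmax(new_logits, temperature)
    kl = (p_old * (np.log(p_old + 1e-12) - np.log(p_new + 1e-12))).sum(axis=-1)
    return float(kl.mean()) * temperature ** 2  # conventional T^2 scaling

same = np.array([[2.0, 0.5, -1.0]])
diff = np.array([[-1.0, 0.5, 2.0]])
print(distillation_loss(same, same))  # 0.0 for identical logits
print(distillation_loss(same, diff) > 0)  # positive when the classifiers disagree
```

In practice this term would be added to the new-class cross-entropy loss, so both objectives are optimized jointly, as the summary describes.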

📝 Abstract
Online class-incremental learning (OCIL) focuses on gradually learning new classes (plasticity) from a data stream in a single pass, while concurrently preserving knowledge of previously learned classes (stability). The primary challenge in OCIL lies in maintaining a good balance between the knowledge of old and new classes within the continually updated model. Most existing methods rely on explicit knowledge interaction through experience replay, and often employ exclusive training separation to address bias problems. Nevertheless, achieving a well-balanced learner remains a significant challenge, as these methods often exhibit either reduced plasticity or limited stability due to difficulties in continually integrating knowledge in the OCIL setting. In this paper, we propose a novel replay-based method, called Balanced Online Incremental Learning (BOIL), which can achieve both high plasticity and stability, thus ensuring more balanced performance in OCIL. Our BOIL method proposes an inclusive training separation strategy using dual classifiers so that knowledge from both old and new classes can effectively be integrated into the model, while introducing implicit approaches for transferring knowledge across the two classifiers. Extensive experimental evaluations over three widely used OCIL benchmark datasets demonstrate the superiority of BOIL, showing more balanced yet better performance compared to state-of-the-art replay-based OCIL methods.
Problem

Research questions and friction points this paper is trying to address.

Balancing stability and plasticity in online class-incremental learning
Overcoming bias in knowledge integration for incremental updates
Enhancing implicit knowledge transfer between old and new classes
Innovation

Methods, ideas, or system contributions that make the work stand out.

Inclusive training separation with dual classifiers
Implicit knowledge interaction across classifiers
Balanced plasticity and stability in OCIL
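The dual-classifier design listed above implies that the two heads must be reconciled at inference time. The paper's actual scheme is not described here; as a hypothetical illustration (the blend weight `alpha` and the fusion rule are assumptions), one simple option is a weighted mixture of the two classifiers' probability outputs:

```python
import numpy as np

def fused_prediction(old_logits, new_logits, alpha=0.5):
    """Blend dual-classifier probabilities into one prediction.

    alpha weights the old-class head; this convex-combination rule is
    an illustrative assumption, not the paper's inference scheme.
    """
    def softmax(z):
        z = z - z.max(axis=-1, keepdims=True)  # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    probs = alpha * softmax(old_logits) + (1.0 - alpha) * softmax(new_logits)
    return probs.argmax(axis=-1)

# With alpha=1 only the old head decides; with alpha=0 only the new head does.
old = np.array([[3.0, 0.0, 0.0]])  # old head favors class 0
new = np.array([[0.0, 0.0, 3.0]])  # new head favors class 2
print(fused_prediction(old, new, alpha=1.0))  # [0]
print(fused_prediction(old, new, alpha=0.0))  # [2]
```

Intermediate `alpha` values would trade old-class stability against new-class plasticity, which is the balance the bullets above describe.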