Order-Robust Class Incremental Learning: Graph-Driven Dynamic Similarity Grouping

📅 2025-02-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
In class-incremental learning (CIL), model performance is highly sensitive to the order in which classes are presented, leading to severe catastrophic forgetting. Method: This paper theoretically establishes for the first time that grouping classes with low inter-class similarity yields stronger order robustness. Building on this insight, we propose a graph-driven dynamic similarity grouping framework: (i) construct a class-similarity graph; (ii) apply graph coloring to derive disjoint, robust class groups; (iii) train independent CIL branches in parallel per group; and (iv) fuse multi-branch predictions at inference. Contribution/Results: Our approach decouples the order-sensitive learning process without increasing inference overhead. It significantly mitigates catastrophic forgetting while improving overall accuracy—achieving state-of-the-art performance in both metrics. This work introduces a novel paradigm for order-robust continual learning.
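Steps (i) and (ii) of the pipeline above can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the similarity values, the threshold, and the greedy coloring strategy are all assumptions made for the example. Classes whose pairwise similarity exceeds the threshold are linked by an edge; coloring then forces similar classes into different colors, so each color is a group of mutually low-similarity classes.

```python
# Hypothetical sketch of graph-driven similarity grouping.
# Similarity values, threshold, and greedy coloring order are illustrative
# assumptions, not taken from the paper.

def greedy_color(adjacency):
    """Assign each node the smallest color not used by an already-colored neighbor."""
    colors = {}
    for node in adjacency:
        used = {colors[n] for n in adjacency[node] if n in colors}
        colors[node] = next(c for c in range(len(adjacency) + 1) if c not in used)
    return colors

def group_by_similarity(sim, threshold):
    """sim: dict mapping class pairs (i, j) to a similarity score.
    Returns a dict mapping color -> list of classes (one group per color)."""
    nodes = sorted({c for pair in sim for c in pair})
    adjacency = {c: set() for c in nodes}
    for (i, j), s in sim.items():
        if s > threshold:          # similar classes must end up in different groups
            adjacency[i].add(j)
            adjacency[j].add(i)
    colors = greedy_color(adjacency)
    groups = {}
    for node, color in colors.items():
        groups.setdefault(color, []).append(node)
    return groups

# Toy example: classes 0 and 1 are similar, 2 and 3 are similar,
# all cross pairs are dissimilar.
sim = {(0, 1): 0.9, (0, 2): 0.1, (0, 3): 0.2,
       (1, 2): 0.15, (1, 3): 0.1, (2, 3): 0.85}
groups = group_by_similarity(sim, threshold=0.5)
# -> two groups, each containing only mutually dissimilar classes:
#    {0: [0, 2], 1: [1, 3]}
```

Per the summary, one CIL branch would then be trained on each group independently, and their predictions fused at inference.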

📝 Abstract
Class Incremental Learning (CIL) requires a model to continuously learn new classes without forgetting previously learned ones. While recent studies have significantly alleviated the problem of catastrophic forgetting (CF), a growing body of research reveals that the order in which classes appear has a significant influence on CIL models. Specifically, prioritizing the learning of classes with lower similarity enhances the model's generalization performance and its ability to mitigate forgetting. Hence, it is imperative to develop an order-robust class incremental learning model that maintains stable performance even when faced with varying levels of class similarity in different orders. In response, we first provide additional theoretical analysis, which reveals that when the similarity among a group of classes is lower, the model demonstrates increased robustness to the class order. Then, we introduce a novel Graph-Driven Dynamic Similarity Grouping (GDDSG) method, which leverages a graph coloring algorithm for class-based similarity grouping. The proposed approach trains independent CIL models for each group of classes, ultimately combining these models to facilitate joint prediction. Experimental results demonstrate that our method effectively addresses the issue of class order sensitivity while achieving optimal performance in both model accuracy and anti-forgetting capability. Our code is available at https://github.com/AIGNLAI/GDDSG.
Problem

Research questions and friction points this paper is trying to address.

Order-Robust Class Incremental Learning
Dynamic Similarity Grouping
Mitigating Catastrophic Forgetting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph-Driven Dynamic Similarity Grouping
Leverages graph coloring algorithm
Trains independent CIL models
Guannan Lai
Nanjing University
Continual Learning · Model Reuse
Yujie Li
School of Computer and Artificial Intelligence, Southwestern University of Finance and Economics; The Leiden Institute of Advanced Computer Science (LIACS), Leiden University
Xiangkun Wang
University of Science and Technology
Steganography · Diffusion Model
Junbo Zhang
JD Intelligent Cities Research
Tianrui Li
School of Computing and Artificial Intelligence, Southwest Jiaotong University
Big Data Intelligence · Urban Computing · Granular Computing
Xin Yang
School of Computer and Artificial Intelligence, Southwestern University of Finance and Economics