🤖 AI Summary
In class-incremental learning (CIL), model performance is highly sensitive to the order in which classes are presented, which can exacerbate catastrophic forgetting. Method: This paper theoretically establishes for the first time that grouping classes with low inter-class similarity yields stronger order robustness. Building on this insight, we propose a graph-driven dynamic similarity grouping framework: (i) construct a class-similarity graph; (ii) apply graph coloring to derive disjoint groups of mutually dissimilar classes; (iii) train an independent CIL branch per group; and (iv) fuse the branches' predictions at inference. Contribution/Results: Our approach decouples the order-sensitive learning process without increasing inference overhead. It significantly mitigates catastrophic forgetting while improving overall accuracy, achieving state-of-the-art performance on both metrics. This work introduces a novel paradigm for order-robust continual learning.
📝 Abstract
Class Incremental Learning (CIL) requires a model to continuously learn new classes without forgetting previously learned ones. While recent studies have significantly alleviated the problem of catastrophic forgetting (CF), a growing body of research reveals that the order in which classes appear has a significant influence on CIL models. Specifically, prioritizing the learning of classes with lower similarity enhances the model's generalization performance and its ability to mitigate forgetting. Hence, it is imperative to develop an order-robust class incremental learning model that maintains stable performance even when faced with varying levels of class similarity in different orders. In response, we first provide additional theoretical analysis, which reveals that when the similarity among a group of classes is lower, the model demonstrates increased robustness to the class order. Then, we introduce a novel Graph-Driven Dynamic Similarity Grouping (GDDSG) method, which leverages a graph coloring algorithm for class-based similarity grouping. The proposed approach trains an independent CIL model for each group of classes, ultimately combining these models to facilitate joint prediction. Experimental results demonstrate that our method effectively addresses the issue of class order sensitivity while achieving optimal performance in both model accuracy and anti-forgetting capability. Our code is available at https://github.com/AIGNLAI/GDDSG.
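The grouping step can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that); it only shows the underlying idea: connect any two classes whose similarity exceeds a threshold, then greedily color the resulting graph so that adjacent (i.e., highly similar) classes land in different groups. Each color class then contains only mutually dissimilar classes. The similarity matrix, the threshold, and the largest-degree-first ordering here are illustrative assumptions.

```python
import numpy as np

def similarity_groups(sim, threshold=0.5):
    """Partition classes into groups of mutually dissimilar classes.

    sim: (n, n) symmetric similarity matrix (illustrative input).
    Classes with sim > threshold are connected by an edge and greedy
    graph coloring forces them into different groups.
    """
    n = sim.shape[0]
    # Adjacency: an edge joins two classes too similar to share a group.
    adj = [{j for j in range(n) if j != i and sim[i, j] > threshold}
           for i in range(n)]
    color = {}
    # Greedy coloring, visiting highest-degree classes first.
    for node in sorted(range(n), key=lambda i: -len(adj[i])):
        used = {color[nb] for nb in adj[node] if nb in color}
        c = 0
        while c in used:  # smallest color not used by a similar neighbor
            c += 1
        color[node] = c
    groups = {}
    for node, c in color.items():
        groups.setdefault(c, []).append(node)
    return [sorted(g) for g in groups.values()]

# Toy example: classes 0/1 and 2/3 are highly similar pairs,
# so each pair must be split across groups.
sim = np.array([[1.0, 0.9, 0.1, 0.1],
                [0.9, 1.0, 0.1, 0.1],
                [0.1, 0.1, 1.0, 0.9],
                [0.1, 0.1, 0.9, 1.0]])
groups = similarity_groups(sim)
```

In the full method, a separate CIL branch would then be trained on each group, with their predictions combined at test time.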