🤖 AI Summary
To address catastrophic forgetting of historical knowledge and poor learning efficiency for novel classes in class-incremental learning with repetition (CIR), this paper proposes a continual learning framework integrating multi-level knowledge distillation and dynamically weighted self-supervised learning. Methodologically: (1) a multi-level distillation mechanism transfers historical knowledge from multiple previous models at both the feature and logits levels; (2) a dynamically weighted self-supervised loss adaptively exploits external unlabeled data to discover discriminative structure, balancing stability and plasticity. The method achieves second place in the CVPR 5th CLVISION Challenge, significantly improving long-term model stability and novel-class adaptation under the CIR setting. This work establishes a scalable paradigm for leveraging internet-scale unlabeled data to enhance continual learning, offering both theoretical insight and practical utility for real-world deployment.
📝 Abstract
Class-incremental with repetition (CIR), where previously trained classes are repeatedly introduced in future tasks, is a more realistic scenario than the traditional class-incremental setup, which assumes that each task contains only unseen classes. CIR also assumes that we can easily access abundant unlabeled data from external sources, such as the Internet. We therefore propose two components that efficiently use this unlabeled data to ensure both the stability and the plasticity of models trained in the CIR setup. First, we introduce multi-level knowledge distillation (MLKD), which distills knowledge from multiple previous models across multiple perspectives, including features and logits, so that the model retains a wider variety of previous knowledge. Second, we implement a dynamic self-supervised loss (SSL) that exploits the unlabeled data to accelerate the learning of new classes, while dynamic weighting of the SSL keeps the focus of training on the primary task. Both proposed components significantly improve performance in the CIR setup, achieving 2nd place in the CVPR 5th CLVISION Challenge.
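The two components can be illustrated with a minimal, framework-agnostic sketch. The function names (`mlkd_loss`, `dynamic_ssl_weight`), the choice of MSE for the feature level, KL divergence for the logits level, and the linearly decaying SSL schedule are all illustrative assumptions, not the paper's actual implementation; the abstract only specifies that distillation spans multiple previous models at the feature and logits levels, and that the SSL weight is adjusted dynamically.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a plain list of logits.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    s = sum(exps)
    return [e / s for e in exps]

def mlkd_loss(student_feat, student_logits, teachers, temperature=2.0, feat_weight=1.0):
    """Hypothetical multi-level KD loss, averaged over several previous
    models ("teachers"). Each teacher contributes a feature-level MSE term
    and a logits-level KL term on temperature-softened distributions."""
    total = 0.0
    for t_feat, t_logits in teachers:
        # Feature level: mean squared error between intermediate features.
        mse = sum((s - t) ** 2 for s, t in zip(student_feat, t_feat)) / len(student_feat)
        # Logits level: KL(teacher || student), scaled by T^2 as in standard KD.
        p = softmax(t_logits, temperature)
        q = softmax(student_logits, temperature)
        kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
        total += feat_weight * mse + (temperature ** 2) * kl
    return total / len(teachers)

def dynamic_ssl_weight(step, total_steps, w_max=1.0):
    # Assumed schedule: the SSL weight decays over training so that the
    # primary (supervised) task dominates later; the actual weighting rule
    # in the paper may differ.
    return w_max * (1.0 - step / total_steps)
```

In training, the total objective would then be something like `supervised_loss + mlkd_loss(...) + dynamic_ssl_weight(step, total_steps) * ssl_loss`, with the SSL term computed on the external unlabeled data.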