Restoring Forgotten Knowledge in Non-Exemplar Class Incremental Learning through Test-Time Semantic Evolution

๐Ÿ“… 2025-03-21
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Catastrophic forgetting in non-exemplar class-incremental learning (NECIL) arises due to the unavailability of old-class samples during incremental phases. Method: This work pioneers a test-time knowledge recovery paradigm, introducing a semantic evolution mechanism that requires neither old samples nor model retraining. It establishes a test-time semantic drift compensation framework: self-supervised modeling of semantic shift, analytical optimization (replacing online gradient updates), and dynamic feature calibrationโ€”all performed without rehearsal or access to historical data. Contribution/Results: The approach achieves state-of-the-art performance on CIFAR-100, TinyImageNet, and ImageNet100, significantly improving old-class accuracy under both cold-start and warm-start settings. By decoupling knowledge retention from data replay and parameter fine-tuning, it enhances both stability and plasticity, establishing a novel paradigm for NECIL.

Technology Category

Application Category

๐Ÿ“ Abstract
Continual learning aims to accumulate knowledge over a data stream while mitigating catastrophic forgetting. In Non-exemplar Class Incremental Learning (NECIL), forgetting arises during incremental optimization because old classes are inaccessible, hindering the retention of prior knowledge. To solve this, previous methods struggle in achieving the stability-plasticity balance in the training stages. However, we note that the testing stage is rarely considered among them, but is promising to be a solution to forgetting. Therefore, we propose RoSE, which is a simple yet effective method that extbf{R}est extbf{o}res forgotten knowledge through test-time extbf{S}emantic extbf{E}volution. Specifically designed for minimizing forgetting, RoSE is a test-time semantic drift compensation framework that enables more accurate drift estimation in a self-supervised manner. Moreover, to avoid incomplete optimization during online testing, we derive an analytical solution as an alternative to gradient descent. We evaluate RoSE on CIFAR-100, TinyImageNet, and ImageNet100 datasets, under both cold-start and warm-start settings. Our method consistently outperforms most state-of-the-art (SOTA) methods across various scenarios, validating the potential and feasibility of test-time evolution in NECIL.
Problem

Research questions and friction points this paper is trying to address.

Mitigates catastrophic forgetting in continual learning
Addresses old class inaccessibility in NECIL
Proposes test-time semantic evolution for knowledge retention
Innovation

Methods, ideas, or system contributions that make the work stand out.

Test-time semantic drift compensation framework
Self-supervised accurate drift estimation
Analytical solution replaces gradient descent
๐Ÿ”Ž Similar Papers
No similar papers found.