Noise-Tolerant Coreset-Based Class Incremental Continual Learning

📅 2025-04-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In class-incremental learning (CIL), replay-based methods suffer severe performance degradation and exacerbated catastrophic forgetting under label/instance noise. To address this, we first theoretically characterize the robustness boundary of Coreset-based memory replay against uncorrelated instance noise. We then propose two noise-robust Coreset construction algorithms that jointly integrate uncertainty-aware sample selection and noise-suppressing reweighting, preserving memory compactness while enhancing discriminative stability. Through generalization error bound analysis and formal modeling of additive instance noise and random label noise, our approach achieves an average accuracy improvement of 12.3% and a 37.6% reduction in forgetting rate across five benchmark datasets. It significantly outperforms existing memory-based CIL methods, particularly under high-noise regimes, demonstrating strong robustness and scalability.

Technology Category

Application Category

📝 Abstract
Many applications of computer vision require the ability to adapt to novel data distributions after deployment. Adaptation requires algorithms capable of continual learning (CL). Continual learners must be plastic to adapt to novel tasks while minimizing forgetting of previous tasks.However, CL opens up avenues for noise to enter the training pipeline and disrupt the CL. This work focuses on label noise and instance noise in the context of class-incremental learning (CIL), where new classes are added to a classifier over time, and there is no access to external data from past classes. We aim to understand the sensitivity of CL methods that work by replaying items from a memory constructed using the idea of Coresets. We derive a new bound for the robustness of such a method to uncorrelated instance noise under a general additive noise threat model, revealing several insights. Putting the theory into practice, we create two continual learning algorithms to construct noise-tolerant replay buffers. We empirically compare the effectiveness of prior memory-based continual learners and the proposed algorithms under label and uncorrelated instance noise on five diverse datasets. We show that existing memory-based CL are not robust whereas the proposed methods exhibit significant improvements in maximizing classification accuracy and minimizing forgetting in the noisy CIL setting.
Problem

Research questions and friction points this paper is trying to address.

Study sensitivity of continual learning to label and instance noise
Develop noise-tolerant replay buffers for class-incremental learning
Improve classification accuracy and reduce forgetting in noisy settings
Innovation

Methods, ideas, or system contributions that make the work stand out.

Coreset-based replay buffers for noise tolerance
New bound for robustness to uncorrelated noise
Two noise-tolerant continual learning algorithms
🔎 Similar Papers
No similar papers found.
E
Edison Mucllari
University of Kentucky, Lexington, KY, USA
Aswin Raghavan
Aswin Raghavan
SRI International
Reinforcement LearningMachine LearningArtificial IntelligenceDeep LearningComputer Vision
Z
Z. Daniels
SRI International, Princeton, NJ, USA