ACIL: Active Class Incremental Learning for Image Classification

📅 2026-02-04
🏛️ British Machine Vision Conference
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the high annotation cost and resource inefficiency in class-incremental learning, where full labeling is typically required at each incremental phase. To mitigate this issue, the study introduces active learning into this setting for the first time, proposing a sample selection mechanism that integrates uncertainty estimation with diversity sampling. By annotating only the most informative samples during each incremental stage, the method substantially reduces labeling requirements while effectively alleviating catastrophic forgetting. Extensive experiments on multiple visual benchmarks demonstrate that the proposed framework achieves superior classification performance compared to existing baselines, using significantly fewer labeled samples, thereby confirming its effectiveness and practicality.

📝 Abstract
Continual learning (or class incremental learning) is a realistic learning scenario for computer vision systems, where deep neural networks are trained on episodic data, and the data from previous episodes are generally inaccessible to the model. Existing research in this domain has primarily focused on avoiding catastrophic forgetting, which occurs due to the continuously changing class distributions in each episode and the inaccessibility of the data from previous episodes. However, these methods assume that all the training samples in every episode are annotated; this not only incurs a huge annotation cost, but also results in a wastage of annotation effort, since most of the samples in a given episode will not be accessible to the model in subsequent episodes. Active learning algorithms identify the salient and informative samples from large amounts of unlabeled data and are instrumental in reducing the human annotation effort in inducing a deep neural network. In this paper, we propose ACIL, a novel active learning framework for class incremental learning settings. We exploit a criterion based on uncertainty and diversity to identify the exemplar samples that need to be annotated in each episode, and will be appended to the data in the next episode. Such a framework can drastically reduce annotation cost and can also avoid catastrophic forgetting. Our extensive empirical analyses on several vision datasets corroborate the promise and potential of our framework against relevant baselines.
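The abstract describes selecting exemplars for annotation in each episode via a criterion combining uncertainty and diversity. The paper does not spell out the exact scoring rule here, so the following is a minimal illustrative sketch, not the authors' implementation: it scores unlabeled samples by predictive entropy (uncertainty) and greedy farthest-point distance in feature space (diversity), then picks a fixed annotation budget. The function name `select_exemplars` and the mixing weight `alpha` are assumptions for illustration.

```python
import numpy as np

def select_exemplars(probs, feats, budget, alpha=0.5):
    """Pick `budget` pool samples to annotate (illustrative sketch).

    probs: (n, c) softmax predictions for the unlabeled pool
    feats: (n, d) feature embeddings of the same samples
    alpha: weight between uncertainty and diversity (assumed knob)
    """
    # Uncertainty: normalized predictive entropy per sample.
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    entropy = entropy / (entropy.max() + 1e-12)

    # Seed with the single most uncertain sample.
    selected = [int(np.argmax(entropy))]
    while len(selected) < budget:
        # Diversity: distance to the nearest already-selected sample
        # (greedy farthest-point / k-center-style step).
        dists = np.min(
            np.linalg.norm(
                feats[:, None, :] - feats[selected][None, :, :], axis=2
            ),
            axis=1,
        )
        dists = dists / (dists.max() + 1e-12)
        score = alpha * entropy + (1 - alpha) * dists
        score[selected] = -np.inf  # never re-pick a chosen sample
        selected.append(int(np.argmax(score)))
    return selected
```

In the incremental setting described above, the returned indices would be annotated and appended to the next episode's data; the actual ACIL criterion may weight or compute these terms differently.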
Problem

Research questions and friction points this paper is trying to address.

class incremental learning
active learning
annotation cost
catastrophic forgetting
image classification
Innovation

Methods, ideas, or system contributions that make the work stand out.

active learning
class incremental learning
catastrophic forgetting
uncertainty sampling
exemplar selection
Aditya R. Bhattacharya
Department of Computer Science, Florida State University
Debanjan Goswami
Department of Computer Science, Florida State University
Shayok Chakraborty
Researcher
Machine Learning · Computer Vision