Towards Robust Incremental Learning under Ambiguous Supervision

📅 2025-01-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper proposes Incremental Partial Label Learning (IPLL), a novel paradigm that, for the first time, tightly integrates partial label learning with incremental learning to cope with label ambiguity and high annotation costs in dynamic scenarios. To jointly tackle the coupled challenges of label uncertainty and catastrophic forgetting, the authors design Prototype-Guided Disambiguation and Replay (PGDR): it achieves accurate disambiguation via prototype-driven pseudo-label initialization and momentum-based refinement, and mitigates forgetting through representative, diverse memory sampling coupled with knowledge-distillation-based replay. Extensive experiments on multiple benchmarks demonstrate that PGDR significantly improves disambiguation accuracy on new tasks (+8.2%–14.6%) and substantially alleviates performance degradation on old classes (reducing the forgetting rate by 37.5%). These results validate the method's effectiveness and robustness in weakly supervised incremental learning settings.
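The disambiguation step described above can be sketched roughly as follows: pseudo-labels are initialized from similarity to class prototypes, restricted to each sample's candidate label set, and then refined by a momentum blend with the model's current predictions. This is a minimal NumPy illustration; all function names and the exact update rule are assumptions, not the paper's actual implementation.

```python
import numpy as np

def init_pseudo_labels(features, candidate_mask, prototypes):
    """Initialize pseudo-labels as a softmax over prototype similarities,
    with probability mass restricted to each sample's candidate set.
    Assumes every sample has at least one candidate label."""
    sims = features @ prototypes.T                      # (n, num_classes)
    sims = np.where(candidate_mask, sims, -np.inf)      # non-candidates get zero mass
    exp = np.exp(sims - sims.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)

def momentum_update(pseudo, model_probs, candidate_mask, beta=0.9):
    """Smoothly refine pseudo-labels toward the model's current
    predictions, again masked to the candidate set."""
    probs = np.where(candidate_mask, model_probs, 0.0)
    probs = probs / probs.sum(axis=1, keepdims=True)
    return beta * pseudo + (1.0 - beta) * probs
```

The momentum coefficient `beta` keeps the pseudo-labels stable early in training, when model predictions are still unreliable.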

📝 Abstract
Traditional Incremental Learning (IL) aims to handle sequential fully supervised learning problems where novel classes emerge from time to time. However, due to inherent annotation uncertainty and ambiguity, collecting high-quality annotated data in a dynamic learning system can be extremely expensive. To mitigate this problem, we propose a novel weakly supervised learning paradigm called Incremental Partial Label Learning (IPLL), where sequentially arriving data are associated with a set of candidate labels rather than the ground truth. Technically, we develop the Prototype-Guided Disambiguation and Replay Algorithm (PGDR), which leverages class prototypes as a proxy to mitigate two intertwined challenges in IPLL, i.e., label ambiguity and catastrophic forgetting. To handle the former, PGDR encapsulates a momentum-based pseudo-labeling algorithm along with prototype-guided initialization, resulting in a balanced perception of classes. To alleviate forgetting, we develop a memory replay technique that collects well-disambiguated samples while maintaining representativeness and diversity. By jointly distilling knowledge from curated memory data, our framework exhibits a great disambiguation ability for samples of new tasks and achieves less forgetting of knowledge. Extensive experiments demonstrate that PGDR achieves superior
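The abstract's memory replay technique balances representativeness and diversity when choosing which samples to store. One plausible reading is a greedy per-class selection that favors samples near the class centroid while spreading picks apart, sketched below in NumPy. The scoring rule and `div_weight` parameter are hypothetical, not taken from the paper.

```python
import numpy as np

def select_memory(features, labels, budget_per_class, div_weight=0.5):
    """Greedily pick memory samples per class: prefer samples close to the
    class centroid (representativeness) while penalizing picks that sit
    near already-selected samples (diversity)."""
    memory = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        feats = features[idx]
        centroid = feats.mean(axis=0)
        rep = -np.linalg.norm(feats - centroid, axis=1)   # higher = more central
        chosen = []
        for _ in range(min(budget_per_class, len(idx))):
            if chosen:
                # distance to the nearest already-chosen sample (higher = more diverse)
                dists = np.linalg.norm(
                    feats[:, None, :] - feats[chosen][None, :, :], axis=2)
                div = dists.min(axis=1)
            else:
                div = np.zeros(len(idx))
            score = rep + div_weight * div
            score[chosen] = -np.inf                       # never re-pick a sample
            chosen.append(int(np.argmax(score)))
        memory.extend(idx[chosen].tolist())
    return memory
```

A fixed per-class budget keeps the replay buffer class-balanced, which matters when old classes shrink relative to the growing label space.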
Problem

Research questions and friction points this paper is trying to address.

Incremental Learning
Class Ambiguity
Catastrophic Forgetting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Incremental Partial Label Learning
Prototype-Guided Disambiguation and Replay (PGDR) Algorithm
Class Ambiguity Resolution
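The replay side of PGDR distills knowledge from curated memory data. A standard way to realize this, shown here as an assumed sketch rather than the paper's exact loss, is a temperature-softened KL divergence between the frozen old model (teacher) and the current model (student) on replayed samples.

```python
import numpy as np

def softmax_t(logits, T):
    """Numerically stable softmax with temperature T."""
    z = logits / T
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def kd_replay_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    averaged over the replay batch; the T^2 factor keeps gradient
    magnitudes comparable across temperatures (Hinton-style distillation)."""
    p = softmax_t(teacher_logits, T)
    q = softmax_t(student_logits, T)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=1)
    return float(np.mean(kl) * T * T)
```

When student and teacher agree exactly, the loss is zero; it grows as the current model drifts from the old model's predictions on memory samples.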
Rui Wang
Zhejiang University
Mingxuan Xia
Zhejiang University
Machine Learning
Chang Yao
Zhejiang University
Lei Feng
Singapore University of Technology and Design
Junbo Zhao
Zhejiang University
Gang Chen
Zhejiang University
Haobo Wang
Zhejiang University
Machine Learning