L3A: Label-Augmented Analytic Adaptation for Multi-Label Class Incremental Learning

📅 2025-06-01
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Multi-label class-incremental learning (MLCIL) suffers from both label incompleteness and severe class imbalance, leading to incomplete retention of historical knowledge and model bias toward the head classes of long-tailed distributions. To address these challenges without exemplar replay, we propose Label-Augmented Analytic Adaptation (L3A), an exemplar-free framework comprising two core components: (1) a pseudo-label generation module that mitigates label scarcity during incremental phases; and (2) a Weighted Analytic Classifier (WAC), which jointly models label incompleteness and the long-tailed class distribution via sample-level dynamic weighting, yielding a closed-form solution in the exemplar-free setting, the first such approach in MLCIL. L3A integrates multi-label loss modeling with closed-form neural network adaptation, improving both knowledge retention and novel-class recognition. Extensive experiments demonstrate state-of-the-art performance on MS-COCO and PASCAL VOC. The code is publicly available.

πŸ“ Abstract
Class-incremental learning (CIL) enables models to learn new classes continually without forgetting previously acquired knowledge. Multi-label CIL (MLCIL) extends CIL to a real-world scenario where each sample may belong to multiple classes, introducing several challenges: label absence, which leads to incomplete historical information due to missing labels, and class imbalance, which results in model bias toward majority classes. To address these challenges, we propose Label-Augmented Analytic Adaptation (L3A), an exemplar-free approach that does not store past samples. L3A integrates two key modules. The pseudo-label (PL) module implements label augmentation by generating pseudo-labels for current-phase samples, addressing the label absence problem. The weighted analytic classifier (WAC) derives a closed-form solution for neural networks; it introduces sample-specific weights to adaptively balance class contributions and mitigate class imbalance. Experiments on the MS-COCO and PASCAL VOC datasets demonstrate that L3A outperforms existing methods on MLCIL tasks. Our code is available at https://github.com/scut-zx/L3A.
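The abstract describes the PL module only at a high level. One common way to realize label augmentation in MLCIL is to threshold the sigmoid outputs of the frozen previous-phase model for the old classes; a minimal sketch of that general idea follows (the function name, array layout, and threshold value are illustrative assumptions, not details taken from the paper):

```python
import numpy as np

def augment_labels(partial_labels, old_model_logits, threshold=0.7):
    """Fill in missing old-class labels with pseudo-labels.

    partial_labels: (N, C) multi-hot array; current-phase annotations cover
        only the new classes, so old-class entries are 0 (missing).
    old_model_logits: (N, C) logits from the frozen previous-phase model.
    threshold: sigmoid-confidence cutoff (assumed value for illustration).
    """
    probs = 1.0 / (1.0 + np.exp(-old_model_logits))  # sigmoid
    pseudo = (probs >= threshold).astype(float)
    # Keep ground-truth positives; add confident pseudo-positives
    # for the classes whose labels are absent.
    return np.maximum(partial_labels, pseudo)
```

The augmented label matrix can then be fed to any multi-label loss or, as in an analytic-learning setup, used as the regression target of the classifier.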
Problem

Research questions and friction points this paper is trying to address.

Addresses label absence in multi-label incremental learning
Mitigates class imbalance in incremental learning scenarios
Proposes exemplar-free approach for continual learning without forgetting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pseudo-label module augments missing labels
Weighted analytic classifier balances class contributions
Exemplar-free approach avoids storing past samples
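The closed-form classifier mentioned above follows the usual analytic-learning formulation: sample-weighted ridge regression on frozen backbone features. A minimal sketch of that formulation is below; the weighting scheme, function name, and regularization value are illustrative assumptions, not the paper's exact recursion:

```python
import numpy as np

def weighted_analytic_classifier(X, Y, sample_weights, reg=1e-2):
    """Closed-form sample-weighted ridge regression.

    X: (N, D) frozen backbone features.
    Y: (N, C) multi-hot label matrix (possibly pseudo-label augmented).
    sample_weights: (N,) per-sample weights, e.g. up-weighting samples
        of rare classes to counter class imbalance.
    reg: ridge regularization strength (illustrative value).

    Solves W = (X^T S X + reg*I)^{-1} X^T S Y with S = diag(sample_weights).
    """
    S = np.diag(sample_weights)
    D = X.shape[1]
    W = np.linalg.solve(X.T @ S @ X + reg * np.eye(D), X.T @ S @ Y)
    return W  # (D, C) classifier weights
```

Because the solution depends on `X` and `Y` only through the Gram-style matrices `X^T S X` and `X^T S Y`, an exemplar-free incremental update can accumulate these statistics phase by phase instead of storing past samples.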
👥 Authors
Xiang Zhang
Shien-Ming Wu School of Intelligent Engineering, South China University of Technology, Guangzhou, China
Run He
South China University of Technology
Deep Learning, Continual Learning, Federated Learning, LLM
Jiao Chen
Shien-Ming Wu School of Intelligent Engineering, South China University of Technology, Guangzhou, China
Di Fang
South China University of Technology
Continual Learning
Ming Li
Guangdong Laboratory of Artificial Intelligence and Digital Economy (SZ)
Ziqian Zeng
Associate Professor at South China University of Technology
Natural Language Processing
Cen Chen
School of Future Technology, South China University of Technology, Guangzhou, China
Huiping Zhuang
Associate Professor, South China University of Technology
Continual Learning, Multi-Modal, Embodied AI, Large Model