Integrating Task-Specific and Universal Adapters for Pre-Trained Model-based Class-Incremental Learning

📅 2025-08-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing class-incremental learning methods suffer from suboptimal adapter selection during inference and neglect cross-task knowledge sharing via task-specific adapters, leading to misclassification of similar classes and catastrophic forgetting. To address these issues, we propose an incremental learning framework that jointly leverages task-specific and task-agnostic adapters. Specifically, we design an entropy-driven dynamic adapter selection mechanism for fine-grained, task-level discrimination, and introduce a lightweight universal adapter whose feature fusion with task-specific adapters is jointly optimized to explicitly model semantic commonalities across tasks. Crucially, our approach achieves these improvements without increasing the parameter count of the pre-trained backbone. Extensive experiments on multiple standard benchmarks demonstrate state-of-the-art performance, with significant gains in both classification accuracy and stability over existing methods.
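The entropy-driven adapter selection described above can be sketched as follows: run the input through each task-specific adapter, measure how confident (low-entropy) each adapter's prediction is, and route to the most confident one. This is an illustrative sketch, not the paper's implementation; the function names and the use of raw prediction entropy as the routing score are assumptions.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def entropy(probs, eps=1e-12):
    # Shannon entropy of a probability vector; lower entropy
    # means a more confident (peaked) prediction.
    return -(probs * np.log(probs + eps)).sum(axis=-1)

def select_adapter(per_adapter_logits):
    """Pick the adapter whose prediction has the lowest entropy.

    per_adapter_logits: list of 1-D logit arrays, one per
    task-specific adapter (hypothetical interface).
    Returns (index of chosen adapter, its class probabilities).
    """
    probs = [softmax(l) for l in per_adapter_logits]
    ents = [entropy(p) for p in probs]
    best = int(np.argmin(ents))
    return best, probs[best]
```

For example, an adapter producing logits `[10, 0, 0]` (confident) would be chosen over one producing `[1, 1, 1]` (uniform, maximum entropy).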

📝 Abstract
Class-Incremental Learning (CIL) requires a learning system to continually learn new classes without forgetting. Existing pre-trained model-based CIL methods often freeze the pre-trained network and adapt to incremental tasks using additional lightweight modules such as adapters. However, incorrect module selection during inference hurts performance, and task-specific modules often overlook shared general knowledge, leading to errors in distinguishing between similar classes across tasks. To address these challenges, we propose integrating Task-Specific and Universal Adapters (TUNA) in this paper. Specifically, we train task-specific adapters to capture the most crucial features relevant to their respective tasks and introduce an entropy-based selection mechanism to choose the most suitable adapter. Furthermore, we leverage an adapter fusion strategy to construct a universal adapter, which encodes the most discriminative features shared across tasks. We combine task-specific and universal adapter predictions to harness both specialized and general knowledge during inference. Extensive experiments on various benchmark datasets demonstrate the state-of-the-art performance of our approach. Code is available at: https://github.com/LAMDA-CL/ICCV2025-TUNA
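The fusion and combination steps in the abstract can be sketched minimally: build a universal adapter from the task-specific ones, then blend specialized and general predictions at inference. This sketch uses simple parameter averaging as the fusion and a fixed mixing coefficient `alpha`; both are illustrative assumptions, not the paper's actual strategy.

```python
import numpy as np

def fuse_adapters(adapter_weights):
    # One plausible fusion: element-wise average of task-specific adapter
    # parameters (each adapter is a dict of named weight arrays) to form
    # a single "universal" adapter. Illustrative only; TUNA's fusion
    # strategy may weight or select parameters differently.
    return {name: np.mean([w[name] for w in adapter_weights], axis=0)
            for name in adapter_weights[0]}

def combine_predictions(task_probs, universal_probs, alpha=0.5):
    # Blend the chosen task-specific adapter's prediction with the
    # universal adapter's prediction. `alpha` is a hypothetical mixing
    # coefficient, not a value from the paper.
    mixed = alpha * task_probs + (1 - alpha) * universal_probs
    return mixed / mixed.sum()  # renormalize to a distribution
```

Averaging keeps the universal adapter the same size as a single task-specific adapter, consistent with the claim that no parameters are added to the pre-trained backbone.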
Problem

Research questions and friction points this paper is trying to address.

Addresses incorrect module selection in incremental learning inference
Solves task-specific adapters overlooking shared general knowledge
Improves distinguishing similar classes across incremental learning tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrating task-specific and universal adapters
Entropy-based selection mechanism for adapters
Adapter fusion strategy for shared features
Yan Wang
School of Artificial Intelligence, Nanjing University; National Key Laboratory for Novel Software Technology, Nanjing University

Da-Wei Zhou
Associate Researcher, Nanjing University
Incremental Learning · Continual Learning · Open-World Learning · Model Reuse

Han-Jia Ye
Nanjing University
Machine Learning · Data Mining · Metric Learning · Meta-Learning