🤖 AI Summary
This work addresses catastrophic forgetting in class-incremental continual learning by proposing a lightweight, task-aware incremental prompting method that requires no rehearsal of old data. The core innovation lies in an adaptive key-learner and a task-identifier-driven dynamic prompt generation mechanism, which jointly models both general-purpose and task-specific knowledge. Additionally, a learnable key-value memory module is introduced to facilitate cross-task knowledge transfer. Evaluated on standard continual learning benchmarks—including CIFAR-100 and ImageNet-100—the method achieves significant improvements over state-of-the-art approaches: average accuracy increases by 2.1–4.7 percentage points, while forgetting rates decrease by up to 37%. These results demonstrate the method’s superior knowledge accumulation capability, generalization performance, and parameter efficiency.
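The key-learner idea described above can be illustrated with a minimal sketch: each task contributes a key vector and an associated prompt, and at inference a query feature selects the prompt whose key is most similar. This is a hypothetical simplification for intuition only — the class and method names are our own, and the actual INCPrompt key-learner and prompt generator are learned jointly with the backbone rather than randomly initialized as here.

```python
import math
import random

def cosine_sim(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb + 1e-8)

class PromptPool:
    """Illustrative key-value prompt pool (not the paper's exact design):
    one key and one prompt matrix per task; a query feature picks the
    task whose key it matches best."""

    def __init__(self, feat_dim, prompt_len, embed_dim, seed=0):
        self.rng = random.Random(seed)
        self.feat_dim = feat_dim
        self.prompt_len = prompt_len
        self.embed_dim = embed_dim
        self.keys = []     # one key vector per task (random init here; learned in practice)
        self.prompts = []  # one (prompt_len x embed_dim) prompt per task

    def add_task(self):
        """Register a new task with a fresh key and prompt."""
        self.keys.append([self.rng.gauss(0, 1) for _ in range(self.feat_dim)])
        self.prompts.append([[self.rng.gauss(0, 1) for _ in range(self.embed_dim)]
                             for _ in range(self.prompt_len)])

    def select(self, query):
        """Return the index and prompt of the task whose key best matches the query."""
        sims = [cosine_sim(query, k) for k in self.keys]
        t = max(range(len(sims)), key=sims.__getitem__)
        return t, self.prompts[t]
```

The selected prompt would then be prepended to the input tokens of a frozen transformer, so that only the small key and prompt parameters grow with the number of tasks — consistent with the parameter efficiency the summary highlights.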
📝 Abstract
This paper introduces INCPrompt, an innovative continual learning solution that effectively addresses catastrophic forgetting. INCPrompt’s key innovation lies in its use of an adaptive key-learner and task-aware prompts that capture task-relevant information. This combination encapsulates general knowledge shared across tasks while encoding task-specific knowledge. Our comprehensive evaluation across multiple continual learning benchmarks demonstrates INCPrompt’s superiority over existing algorithms, showing its effectiveness in mitigating catastrophic forgetting while maintaining high performance. These results highlight the significant impact of task-aware incremental prompting on continual learning performance.