C${}^2$Prompt: Class-aware Client Knowledge Interaction for Federated Continual Learning

📅 2025-09-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing prompt-based federated continual learning (FCL) suffers from insufficient inter-class knowledge coherence: intra-class distribution shifts across clients degrade prompt semantic consistency, while inter-class correlations induce cross-class confusion, exacerbating spatio-temporal forgetting. This work is the first to systematically identify and address this issue, proposing a class-aware client knowledge interaction mechanism comprising: (1) local class-distribution compensation to mitigate intra-class shift, and (2) class-aware prompt aggregation to suppress inter-class interference. The method unifies prompt learning, federated optimization, distribution alignment, and selective knowledge aggregation, enabling joint optimization of local compensation and global coordination. Evaluated on multiple FCL benchmarks, it achieves state-of-the-art performance, significantly alleviating spatio-temporal forgetting while enhancing model generalization and stability.

📝 Abstract
Federated continual learning (FCL) tackles scenarios of learning from continuously emerging task data across distributed clients, where the key challenge lies in simultaneously addressing temporal forgetting across tasks and spatial forgetting across clients. Recently, prompt-based FCL methods have shown advanced performance through task-wise prompt communication. In this study, we underscore that existing prompt-based FCL methods are prone to insufficient class-wise knowledge coherence between prompts across clients. Class-wise knowledge coherence involves two aspects: (1) the intra-class distribution gap across clients, which degrades the learned semantics across prompts, and (2) inter-prompt class-wise relevance, which gives rise to cross-class knowledge confusion. During prompt communication, insufficient class-wise coherence exacerbates knowledge conflicts among new prompts and induces interference with old prompts, intensifying both spatial and temporal forgetting. To address these issues, we propose a novel Class-aware Client Knowledge Interaction (C${}^2$Prompt) method that explicitly enhances class-wise knowledge coherence during prompt communication. Specifically, a local class distribution compensation mechanism (LCDC) is introduced to reduce intra-class distribution disparities across clients, thereby reinforcing intra-class knowledge consistency. Additionally, a class-aware prompt aggregation scheme (CPA) is designed to alleviate inter-class knowledge confusion by selectively strengthening class-relevant knowledge aggregation. Extensive experiments on multiple FCL benchmarks demonstrate that C${}^2$Prompt achieves state-of-the-art performance. Our source code is available at https://github.com/zhoujiahuan1991/NeurIPS2025-C2Prompt
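The page does not spell out the LCDC mechanism. As an illustration only, under assumed simplifications (clients share per-class feature means with the server, which averages them into global class prototypes; the function names and the `alpha` strength are hypothetical), one compensation step that shrinks the intra-class gap across clients might look like:

```python
import numpy as np

def class_means(features, labels, num_classes):
    """Per-class mean feature on one client (zeros for absent classes)."""
    means = np.zeros((num_classes, features.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            means[c] = features[mask].mean(axis=0)
    return means

rng = np.random.default_rng(0)
num_classes, dim = 3, 4

# Two toy clients whose feature distributions are shifted against each other.
feats_a = rng.normal(0.0, 1.0, (30, dim))
labels_a = rng.integers(0, num_classes, 30)
feats_b = rng.normal(0.5, 1.0, (30, dim))
labels_b = rng.integers(0, num_classes, 30)

# Server side: average client statistics into global class prototypes.
proto = 0.5 * (class_means(feats_a, labels_a, num_classes)
               + class_means(feats_b, labels_b, num_classes))

# Client side: pull each local feature toward the global prototype of its
# class, reducing the intra-class disparity across clients.
alpha = 0.5  # compensation strength (hypothetical hyperparameter)
local_means = class_means(feats_a, labels_a, num_classes)
feats_a_comp = feats_a + alpha * (proto[labels_a] - local_means[labels_a])
```

After the step, client A's per-class means sit a factor of `1 - alpha` closer to the global prototypes, which is the "intra-class knowledge consistency" effect the abstract describes.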
Problem

Research questions and friction points this paper is trying to address.

Addresses class-wise knowledge coherence issues in federated continual learning
Reduces intra-class distribution disparities across distributed clients
Alleviates inter-class knowledge confusion during prompt communication
Innovation

Methods, ideas, or system contributions that make the work stand out.

Local class distribution compensation reduces intra-class disparities
Class-aware prompt aggregation alleviates inter-class knowledge confusion
Explicitly enhances class-wise knowledge coherence during prompt communication
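The class-aware prompt aggregation (CPA) idea can be pictured as weighting each client's prompt by how well that client's local class distribution matches the classes the merged prompt should serve. This is an illustrative sketch, not the paper's actual scheme: the function name, the cosine-similarity measure, and the softmax weighting are all assumptions.

```python
import numpy as np

def class_aware_aggregate(prompts, client_class_dists, target_dist):
    """Weighted prompt aggregation favouring class-relevant clients.

    prompts:            (K, L, D) one length-L prompt per client
    client_class_dists: (K, C)    each client's label distribution
    target_dist:        (C,)      class mix the merged prompt should serve
    """
    # Cosine similarity between each client's class mix and the target mix.
    sims = client_class_dists @ target_dist
    sims = sims / (np.linalg.norm(client_class_dists, axis=1)
                   * np.linalg.norm(target_dist) + 1e-8)
    weights = np.exp(sims) / np.exp(sims).sum()  # softmax over clients
    return np.tensordot(weights, prompts, axes=1)  # (L, D) merged prompt

# Three clients, each holding exactly one class: the client whose class
# matches the target receives the largest weight in the merged prompt.
prompts = np.stack([np.full((2, 3), float(i)) for i in range(3)])
merged = class_aware_aggregate(prompts, np.eye(3), np.array([1.0, 0.0, 0.0]))
```

Down-weighting clients whose class mix is unrelated to the target is one way to "selectively strengthen class-relevant knowledge aggregation" and limit cross-class confusion during communication.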
Kunlun Xu
Wangxuan Institute of Computer Technology, Peking University, Beijing, China
Yibo Feng
Wangxuan Institute of Computer Technology, Peking University, Beijing, China
Jiangmeng Li
Institute of Software, Chinese Academy of Sciences
Multi-modal learning, Self-supervised learning, Domain generalization, Causal learning
Yongsheng Qi
Inner Mongolia University of Technology, Hohhot, Inner Mongolia Autonomous Region, China
Jiahuan Zhou
Peking University
Computer Vision, Machine Learning, Deep Learning