Federated Class-Incremental Learning with Prompting

πŸ“… 2023-10-13
πŸ›οΈ arXiv.org
πŸ“ˆ Citations: 2
✨ Influential: 1
πŸ€– AI Summary
This work addresses the practical challenge in Federated Class-Incremental Learning (FCIL) where non-IID data distributions and catastrophic forgetting co-occur. The authors propose a novel prompting-based framework to tackle both issues simultaneously. Methodologically, it integrates task-aware knowledge encoding, federated aggregation, and dynamic prompt pool re-ranking. The key contributions are: (1) the first introduction of prompting into FCIL, enabling joint encoding of task-relevant and task-irrelevant knowledge without replay buffers; and (2) a local prompt pool with task-information alignment to mitigate non-IID bias caused by client-level class absence. Evaluated on CIFAR-100, Mini-ImageNet, and Tiny-ImageNet, the approach achieves state-of-the-art performance, significantly outperforming existing methods in both classification accuracy and forgetting resistance. The framework thus advances FCIL by unifying robustness to data heterogeneity and continual learning stability within a lightweight, buffer-free prompting paradigm.
πŸ“ Abstract
As Web technology continues to develop, it has become increasingly common to use data stored on different clients. At the same time, federated learning has received widespread attention due to its ability to protect data privacy while letting models learn from data distributed across various clients. However, most existing works assume that each client's data are fixed. In real-world scenarios, this assumption is unlikely to hold, as data may be continuously generated and new classes may appear. To this end, we focus on the practical and challenging federated class-incremental learning (FCIL) problem. In FCIL, the local and global models may suffer from catastrophic forgetting of old classes caused by the arrival of new classes, and the data distributions of clients are non-independent and identically distributed (non-iid). In this paper, we propose a novel method called Federated Class-Incremental Learning with PrompTing (FCILPT). Given privacy constraints and limited memory, FCILPT does not use a rehearsal buffer to keep exemplars of old data. Instead, we use prompts to ease the catastrophic forgetting of old classes. Specifically, we encode the task-relevant and task-irrelevant knowledge into prompts, preserving the old and new knowledge of the local clients and alleviating catastrophic forgetting. We first sort the task information in the prompt pool on the local clients to align the task information across different clients before global aggregation. This ensures that the same task's knowledge is fully integrated, mitigating the non-iid problem caused by missing classes among different clients within the same incremental task. Experiments on CIFAR-100, Mini-ImageNet, and Tiny-ImageNet demonstrate that FCILPT achieves significant accuracy improvements over state-of-the-art methods.
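The buffer-free mechanism the abstract describes rests on a prompt pool: instead of replaying old exemplars, each client keeps a small set of learnable prompts, each paired with a key, and selects the most relevant prompts for an input by comparing the input's feature query against the keys. The sketch below illustrates that key-query selection step under stated assumptions; the function and field names (`select_prompts`, `key`, `prompt`) are illustrative, not taken from the paper, and in practice the query would come from a frozen pretrained encoder.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def select_prompts(query, prompt_pool, top_k=2):
    """Rank pool entries by key-query similarity; return the top-k prompts.

    The selected prompts would be prepended to the input tokens of a frozen
    backbone, so only the small prompt parameters carry task knowledge.
    """
    ranked = sorted(prompt_pool, key=lambda e: cosine(query, e["key"]), reverse=True)
    return [e["prompt"] for e in ranked[:top_k]]

# Toy pool: keys are feature-space anchors, prompts are the learnable payloads.
pool = [
    {"key": [1.0, 0.0], "prompt": "P_taskA"},
    {"key": [0.0, 1.0], "prompt": "P_taskB"},
    {"key": [0.9, 0.1], "prompt": "P_taskC"},
]
chosen = select_prompts([1.0, 0.0], pool, top_k=2)
```

A query near a task's key region retrieves that task's prompts, which is what lets old-task knowledge survive without storing old data.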
Problem

Research questions and friction points this paper is trying to address.

Address catastrophic forgetting in federated class-incremental learning
Handle non-iid data distributions across clients in FCIL
Improve accuracy without rehearsal buffers in privacy-preserving FCIL
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses prompting to prevent catastrophic forgetting
Encodes task-relevant and task-irrelevant knowledge in prompts
Aligns task information before global aggregation
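The alignment step in the last bullet can be pictured as follows: since clients may hold their prompt-pool entries in different orders (some classes of a task are absent on some clients), each client first sorts its pool by task identity so that entry i refers to the same task everywhere, and only then does the server average. This is a minimal sketch under assumed data structures; the names (`aggregate_prompt_pools`, `task_id`) are illustrative, and a real implementation would average tensors, not Python lists.

```python
def aggregate_prompt_pools(client_pools):
    """FedAvg-style aggregation of prompt pools with task alignment.

    Each client pool is a list of {"task_id": int, "prompt": [float, ...]}.
    Sorting by task_id first guarantees that prompts averaged together
    belong to the same incremental task, even if clients stored them
    in different orders.
    """
    aligned = [sorted(pool, key=lambda e: e["task_id"]) for pool in client_pools]
    merged = []
    for entries in zip(*aligned):
        dim = len(entries[0]["prompt"])
        avg = [sum(e["prompt"][i] for e in entries) / len(entries) for i in range(dim)]
        merged.append({"task_id": entries[0]["task_id"], "prompt": avg})
    return merged

# Two clients whose pools list the same two tasks in different orders.
client1 = [{"task_id": 1, "prompt": [1.0, 1.0]}, {"task_id": 0, "prompt": [0.0, 0.0]}]
client2 = [{"task_id": 0, "prompt": [2.0, 2.0]}, {"task_id": 1, "prompt": [3.0, 3.0]}]
global_pool = aggregate_prompt_pools([client1, client2])
```

Without the sort, the server would average task-0 prompts from one client with task-1 prompts from another, mixing unrelated knowledge; alignment is what makes plain averaging safe under non-iid class absence.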