Resource-Constrained Federated Continual Learning: What Does Matter?

📅 2025-01-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the limited practicality of Federated Continual Learning (FCL) on resource-constrained edge devices—characterized by scarce storage, limited computational capacity, and low annotation rates—this work introduces the first large-scale FCL benchmark explicitly designed for resource constraints, covering six diverse datasets under both class-incremental and domain-incremental settings. Through systematic experiments consuming over 1,000 GPU hours, we evaluate the performance degradation of mainstream FCL approaches—including replay-based, regularization-based, and parameter-isolation methods—across multiple resource dimensions. Results reveal significant performance collapse under resource constraints, exposing their inherent dependence on abundant resources. Consequently, we propose a lightweight-deployment-oriented FCL evaluation paradigm and establish joint resource-performance optimization as a core research direction. This work provides critical insights, a reproducible benchmark, and methodological foundations to advance practical FCL deployment.

📝 Abstract
Federated Continual Learning (FCL) aims to enable privacy-preserving model training on sequential streams of data that vary across edge devices, preserving previously acquired knowledge while adapting to new data. The current FCL literature focuses on restricted data privacy and limited access to previously seen data, while imposing no constraints on the training overhead. This is unrealistic for FCL applications in real-world scenarios, where edge devices are primarily constrained by resources such as storage, computational budget, and label rate. We revisit this problem with a large-scale benchmark and analyze the performance of state-of-the-art FCL approaches under different resource-constrained settings. Our experiments cover a range of typical FCL techniques and six datasets in two incremental learning scenarios (Class-IL and Domain-IL). Through extensive experiments totaling over 1,000 GPU hours, we find that, under resource-constrained settings, existing FCL approaches fail without exception to achieve their expected performance. Our conclusions hold consistently in the sensitivity analysis. This suggests that most existing FCL methods are too resource-dependent for real-world deployment. Moreover, we study the performance of typical FCL techniques under resource constraints and shed light on future research directions in FCL.
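The storage constraint the abstract highlights is concrete for replay-based FCL methods, which keep a buffer of past samples on each client to mitigate forgetting. The sketch below is not the paper's code; it is a minimal, hypothetical illustration of a size-capped client replay buffer using reservoir sampling, where `capacity` models the device's storage budget:

```python
import random


class ConstrainedReplayBuffer:
    """Hypothetical sketch of a storage-constrained replay buffer.

    Replay-based continual learning stores past samples to replay during
    training on new tasks; on an edge device, `capacity` bounds how many
    samples fit in storage. Reservoir sampling keeps a uniform sample of
    everything seen so far within that budget.
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity  # storage budget in number of samples
        self.buffer = []
        self.seen = 0             # total samples observed on this client
        self.rng = random.Random(seed)

    def add(self, sample):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(sample)
        else:
            # Replace a random slot with probability capacity / seen,
            # so every observed sample is retained with equal probability.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = sample

    def sample(self, k):
        """Draw up to k stored samples to mix into the current batch."""
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))
```

Under the paper's constrained settings, shrinking `capacity` directly limits how much past data can be replayed, which is one mechanism behind the performance collapse the benchmark measures.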
Problem

Research questions and friction points this paper is trying to address.

Federated Continual Learning
Resource Efficiency
Performance Optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Continual Learning
Resource Efficiency
Performance Evaluation
Yichen Li
School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan, China
Yuying Wang
School of Computer Science and Technology, Soochow University, Suzhou, China
Jiahua Dong
Mohamed bin Zayed University of Artificial Intelligence, Abu Dhabi, United Arab Emirates
Haozhao Wang
Huazhong University of Science and Technology
Cloud-edge Distributed Learning, Federated Learning, AI Security, Multi-modal LLM Agent
Yining Qi
Huazhong University of Science and Technology
federated learning, data security, provable data possession
Rui Zhang
School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan, China
Ruixuan Li
School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan, China