FedERL: Federated Efficient and Robust Learning for Common Corruptions

📅 2025-08-24
📈 Citations: 0 · Influential: 0
🤖 AI Summary
In federated learning, clients are often resource constrained and exhibit poor robustness to common data corruptions such as noise, blur, and weather artifacts, while existing robust training methods impose prohibitive computational overhead that hinders practical deployment. To address this, FedERL is proposed as the first framework to achieve *zero client-side robustness overhead*: all robustness enhancement is performed exclusively on the server via data-agnostic robust training (DART), without access to the clients' raw training data and without any additional robust operations on the clients. DART applies lightweight, data-agnostic augmentation and regularization on the server, balancing efficiency and corruption resilience. Experiments demonstrate that FedERL reduces client-side training time and energy consumption by over 60% while attaining superior robust accuracy compared to traditional robust training, particularly under tight time and energy budgets.
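The summary does not spell out DART's internals, but the core idea, server-side robust fine-tuning that never touches client data, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the noise-based proxy inputs, the toy corruption operators, the consistency loss, and all function names (`synthetic_batch`, `corrupt`, `dart_finetune`) are hypothetical and chosen only for exposition.

```python
# Hypothetical sketch of server-side, data-agnostic robust training (DART-style).
# Assumptions: proxy inputs are sampled without any client data (here: random
# noise), "corruptions" are cheap stand-ins for noise/blur, and the robustness
# objective is a consistency loss between clean and corrupted views.
import torch
import torch.nn.functional as F

def synthetic_batch(batch_size=64, shape=(3, 32, 32)):
    # Data-agnostic proxy inputs: no raw client data is ever touched.
    return torch.rand(batch_size, *shape)

def corrupt(x):
    # Toy stand-ins for common corruptions: additive noise plus a box blur.
    noisy = x + 0.1 * torch.randn_like(x)
    kernel = torch.ones(x.size(1), 1, 3, 3) / 9.0
    blurred = F.conv2d(noisy, kernel, padding=1, groups=x.size(1))
    return blurred.clamp(0.0, 1.0)

def dart_finetune(global_model, steps=100, lr=1e-4):
    # Runs entirely on the server, after aggregation: zero client overhead.
    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    global_model.train()
    for _ in range(steps):
        x = synthetic_batch()
        logits_clean = global_model(x)
        logits_corrupt = global_model(corrupt(x))
        # Regularize the model to respond consistently under corruption.
        loss = F.kl_div(
            F.log_softmax(logits_corrupt, dim=1),
            F.softmax(logits_clean, dim=1).detach(),
            reduction="batchmean",
        )
        opt.zero_grad()
        loss.backward()
        opt.step()
    return global_model
```

Because this loop consumes only server-side compute, the clients' training procedure is left untouched, which is the sense in which the robustness overhead on clients is zero.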

📝 Abstract
Federated learning (FL) accelerates the deployment of deep learning models on edge devices while preserving data privacy. However, FL systems face challenges from client-side constraints on computational resources and from a lack of robustness to common corruptions such as noise, blur, and weather effects. Existing robust training methods are computationally expensive and unsuitable for resource-constrained clients. We propose FedERL, federated efficient and robust learning, as the first work to explicitly address corruption robustness under time and energy constraints on the client side. At its core, FedERL employs a novel data-agnostic robust training (DART) method on the server to enhance robustness without access to the training data. In doing so, FedERL ensures zero robustness overhead for clients. Extensive experiments demonstrate FedERL's ability to handle common corruptions at a fraction of the time and energy cost of traditional robust training methods. In scenarios with limited time and energy budgets, FedERL surpasses the performance of traditional robust training, establishing it as a practical and scalable solution for real-world FL applications.
Problem

Research questions and friction points this paper is trying to address.

Addresses client-side computational constraints in federated learning
Enhances robustness against common corruptions like noise and blur
Reduces time and energy costs for robust training methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Server-side data-agnostic robust training
Zero robustness overhead for clients
Handles common corruptions at a fraction of the time and energy cost of traditional robust training (see the sketch after this list)
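To make the zero-client-overhead point concrete, here is a hedged sketch of one federation round under this design: clients run a completely standard local update, and all robustness work happens after aggregation on the server. The aggregation shown is plain equal-weight FedAvg for simplicity, `dart_finetune` refers to the illustrative sketch above, and none of the function names reflect the paper's exact protocol.

```python
# Hypothetical one-round FedAvg loop: clients unchanged, robustness server-only.
import copy
import torch

def local_update(model, loader, epochs=1, lr=0.01):
    # Plain client-side training: no corruption handling, no extra cost.
    model = copy.deepcopy(model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model.state_dict()

def fed_round(global_model, client_loaders):
    # Standard FedAvg aggregation with equal client weights for simplicity.
    states = [local_update(global_model, ld) for ld in client_loaders]
    avg = {k: torch.stack([s[k].float() for s in states]).mean(0)
           for k in states[0]}
    global_model.load_state_dict(avg)
    # All robustness enhancement happens here, on the server (sketch above).
    return dart_finetune(global_model)
```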
Omar Bekdache
Department of Electrical and Computer Engineering, University of Illinois Urbana-Champaign, Urbana, IL 61801
Naresh Shanbhag
Professor of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign
circuits · signal processing · communications · machine learning