FedEve: On Bridging the Client Drift and Period Drift for Cross-device Federated Learning

📅 2025-08-20
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In cross-device federated learning, data heterogeneity produces client drift through the multiple local updates of FedAvg, while partial client participation induces a second, previously understudied form of drift that this paper names period drift: the clients sampled in each communication round follow a data distribution that deviates from that of the full population, so the optimization objective shifts from round to round. The paper formally characterizes period drift, analyzes its interaction with client drift, and shows that its impact grows more severe as data heterogeneity increases. To jointly mitigate both, the authors propose a predict-observe framework and instantiate it as FedEve, in which the two types of drift compensate each other, with theoretical evidence that the approach reduces the variance of model updates. Experiments on non-IID data in cross-device settings show improved accuracy and convergence over baselines such as FedAvg and SCAFFOLD.

📝 Abstract
Federated learning (FL) is a machine learning paradigm that allows multiple clients to collaboratively train a shared model without exposing their private data. Data heterogeneity is a fundamental challenge in FL that can result in poor convergence and performance degradation. Client drift, which arises from the multiple local updates in FedAvg, has been recognized as one of the factors contributing to this issue. However, in cross-device FL, a different form of drift arises from partial client participation, and it has not been well studied. This drift, which we refer to as period drift, occurs because the clients participating in each communication round may exhibit data distributions that deviate from that of all clients. It can be more harmful than client drift since the optimization objective shifts with every round. In this paper, we investigate the interaction between period drift and client drift, finding that period drift can have a particularly detrimental effect on cross-device FL as the degree of data heterogeneity increases. To tackle these issues, we propose a predict-observe framework and present an instantiated method, FedEve, in which these two types of drift compensate each other to mitigate their overall impact. We provide theoretical evidence that our approach can reduce the variance of model updates. Extensive experiments demonstrate that our method outperforms alternatives on non-IID data in cross-device settings.
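The predict-observe framework described in the abstract can be pictured as a Kalman-filter-style fusion on the server: a prediction of the global update (e.g., carried by server momentum) is corrected by the observation formed from the participating clients' updates, with weights driven by their respective variances. The sketch below is a minimal rendering of that reading; the function name `server_round`, the momentum-based prediction, and the variance proxies are illustrative assumptions, not the published FedEve algorithm.

```python
import numpy as np

def server_round(global_model, client_updates, momentum, beta=0.9):
    """One server round of a predict-observe style update (illustrative only).

    - Prediction: a momentum term forecasts the next global update from past rounds.
    - Observation: the average of this round's (possibly drifted) client updates.
    The two are fused with variance-based weights, so a noisy observation
    (heavy period drift) leans on the prediction, and vice versa.
    """
    # Observation: plain FedAvg-style average over the participating clients.
    observation = np.mean(client_updates, axis=0)

    # Prediction: momentum carried over from previous rounds.
    prediction = momentum

    # Crude variance proxies: spread across client updates (observation noise)
    # vs. disagreement between prediction and observation (prediction error).
    obs_var = np.mean(np.var(client_updates, axis=0)) + 1e-12
    pred_var = np.mean((prediction - observation) ** 2) + 1e-12

    # Kalman-style gain: trust whichever signal currently has lower variance.
    gain = pred_var / (pred_var + obs_var)
    fused_update = prediction + gain * (observation - prediction)

    new_momentum = beta * momentum + (1.0 - beta) * fused_update
    return global_model + fused_update, new_momentum
```

The appeal of a variance-weighted fusion is that a round hit hard by period drift (a noisy observation) falls back on the prediction, while a stale prediction defers to the observation, which matches the abstract's claim that the two kinds of drift can compensate each other.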
Problem

Research questions and friction points this paper is trying to address.

Addressing client drift from multiple local updates
Mitigating period drift due to partial client participation (illustrated by the toy sketch after this list)
Reducing performance degradation in non-IID federated learning
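Period drift is easy to reproduce in a toy setting: with a non-IID population and only a small cohort sampled per round, the cohort's data mixture keeps deviating from the population mixture, so the round-level objective keeps moving. The snippet below is a minimal, self-contained illustration; the class-mixture model, cohort size, and gap metric are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-IID population: each client holds mostly one of 10 classes.
num_clients, num_classes, cohort_size = 100, 10, 10
client_dists = np.full((num_clients, num_classes), 0.02)
client_dists[np.arange(num_clients), rng.integers(0, num_classes, num_clients)] += 0.8
client_dists /= client_dists.sum(axis=1, keepdims=True)

global_dist = client_dists.mean(axis=0)  # objective over *all* clients

# Period drift: each round, the sampled cohort's mixture deviates from the
# global mixture, so the effective optimization target shifts round to round.
for rnd in range(5):
    cohort = rng.choice(num_clients, size=cohort_size, replace=False)
    cohort_dist = client_dists[cohort].mean(axis=0)
    gap = np.abs(cohort_dist - global_dist).sum()  # total-variation-style gap
    print(f"round {rnd}: cohort-vs-global distribution gap = {gap:.3f}")
```

Larger gaps correspond to rounds whose effective objective is farther from the population objective, which is the sense in which period drift shifts the target at every round.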
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proposes predict-observe framework to compensate drifts
Introduces FedEve method reducing update variance
Addresses client and period drift in federated learning
Authors
Tao Shen
College of Computer Science and Technology, Zhejiang University, China
Zexi Li
Alibaba Group
Deep Learning, Large Language Models, Federated Learning
Didi Zhu
Imperial College London
Multi-Modal LLMs, Out-of-Distribution Generalization
Ziyu Zhao
University of South Carolina
Computer Vision, 2D/3D Segmentation, Generative 3D Reconstruction
Chao Wu
School of Public Affairs, Zhejiang University, China
Fei Wu
College of Computer Science and Technology, Zhejiang University, China