Mitigating Persistent Client Dropout in Asynchronous Decentralized Federated Learning

📅 2025-08-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
In asynchronous decentralized federated learning (DFL), persistent client dropouts lead to unrecoverable model updates, missing neighbor information, and infeasible loss-function reconstruction. Method: This paper proposes an adaptive client reconstruction strategy that requires neither real data nor global synchronization. It leverages local regularization and dynamic neighbor-model reconstruction to robustly compensate for long-term offline clients within a purely asynchronous, coordinator-free DFL framework. Contribution/Results: Evaluated across three mainstream DFL algorithms on tabular and image datasets—including IID, Non-IID, and class-imbalanced Non-IID settings—the proposed strategy significantly mitigates performance degradation caused by dropouts, enhances convergence stability, and improves system robustness. Crucially, it operates without prior knowledge of data distribution or precise estimation of neighbor states.
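The dynamic neighbor-model reconstruction described above can be sketched roughly as follows. This is an illustrative approximation, not the paper's actual algorithm: the function name `aggregate_with_dropout`, the blending coefficient `lam`, and the rule of pulling a dropped neighbor's stale weights toward the local model are all assumptions made for the example.

```python
import numpy as np

def aggregate_with_dropout(local_w, neighbor_ws, alive, lam=0.5):
    """Sketch: average live neighbor models with reconstructed proxies
    for persistently offline neighbors (hypothetical formulation)."""
    models = []
    for w, is_alive in zip(neighbor_ws, alive):
        if is_alive:
            models.append(w)
        else:
            # Reconstruct the offline neighbor by blending its last
            # known (stale) weights toward the local model -- a crude
            # stand-in for local regularization of the missing peer.
            models.append(lam * local_w + (1.0 - lam) * w)
    # Each client aggregates its own model with neighbor (or proxy) models.
    return np.mean([local_w] + models, axis=0)
```

Because the update uses only locally cached neighbor weights, it needs no coordinator and no global synchronization, matching the asynchronous setting described in the summary.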

📝 Abstract
We consider the problem of persistent client dropout in asynchronous Decentralized Federated Learning (DFL). Asynchronicity and decentralization obfuscate information about model updates among federation peers, making recovery from a client dropout difficult. Access to the number of learning epochs, the data distributions, and the other information necessary to precisely reconstruct a missing neighbor's loss function is limited. We show that obvious mitigations do not adequately address the problem and introduce adaptive strategies based on client reconstruction, demonstrating that these strategies can recover some of the performance lost to dropout. Our work focuses on asynchronous DFL with local regularization and differs substantially from the existing literature. We evaluate the proposed methods on tabular and image datasets, covering three DFL algorithms and three data heterogeneity scenarios (IID, non-IID, and class-focused non-IID). Our experiments show that the proposed adaptive strategies can effectively maintain the robustness of federated learning even when they do not reconstruct the missing client's data precisely. We also discuss the limitations and identify future avenues for tackling the problem of client dropout.
Problem

Research questions and friction points this paper is trying to address.

Addressing persistent client dropout in asynchronous decentralized federated learning
Overcoming information obfuscation in model updates among peers
Mitigating performance loss without precise client data reconstruction
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive client reconstruction strategies
Asynchronous decentralized federated learning
Local regularization for robustness
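The "local regularization" listed above is commonly realized in federated learning as a proximal penalty that keeps a client's model near a reference point (for example, an average of neighbor models). A minimal sketch under that assumption — the function name, step size `eta`, and penalty weight `mu` are illustrative, and this is not necessarily the paper's exact formulation:

```python
import numpy as np

def prox_grad_step(w, grad, anchor, eta=0.1, mu=0.01):
    """One gradient step on the local loss augmented with a proximal
    penalty (mu/2) * ||w - anchor||^2, which regularizes the local
    model toward a reference such as a neighbor average."""
    return w - eta * (grad + mu * (w - anchor))
```

The penalty term damps local drift, which is what makes a stale or reconstructed neighbor model usable as an anchor when the real neighbor is offline.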