Semi-decentralized Federated Time Series Prediction with Client Availability Budgets

πŸ“… 2025-09-03
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
To address client data heterogeneity, energy constraints, and dynamic disconnections in federated learning for IoT time-series forecasting, this paper proposes FedDeCABβ€”a semi-decentralized client selection and collaborative optimization framework. Its core innovations include: (i) a probabilistic ranking mechanism for efficient client selection, and (ii) localized model sharing and joint optimization among neighboring clients upon disconnection, thereby mitigating the adverse impact of availability fluctuations on convergence. FedDeCAB maintains low communication overhead while significantly improving global model convergence speed and generalization performance. Extensive experiments on large-scale taxi and vessel trajectory datasets demonstrate that, under highly heterogeneous and dynamic settings, FedDeCAB reduces communication cost by 32.7% on average and accelerates convergence by 1.8Γ— compared to baseline methods, validating its robustness and practicality for real-world IoT deployments.
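The probabilistic ranking step described above can be illustrated with a minimal sketch. The scoring rule, the softmax sampling, and all names below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def select_clients(scores, availability, k, temperature=1.0):
    """Sample k clients without replacement, with probability proportional
    to a softmax over utility scores, restricted to available clients.
    This scoring/sampling rule is a placeholder, not FedDeCAB's exact method."""
    scores = np.asarray(scores, dtype=float)
    mask = np.asarray(availability, dtype=bool)
    logits = scores / temperature
    logits[~mask] = -np.inf              # unavailable clients get zero probability
    probs = np.exp(logits - logits[mask].max())
    probs /= probs.sum()
    k = min(k, int(mask.sum()))
    return rng.choice(len(scores), size=k, replace=False, p=probs)

# Client 2 is offline, so it can never be chosen.
selected = select_clients([1.0, 2.0, 3.0, 4.0], [True, True, False, True], k=2)
```

Ranking clients probabilistically rather than greedily keeps low-score but available clients in play, which matches the paper's stated goal of balancing client contributions under availability budgets.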

πŸ“ Abstract
Federated learning (FL) effectively promotes collaborative training among distributed clients with privacy considerations in Internet of Things (IoT) scenarios. In addition to data heterogeneity, FL clients may also be constrained by limited energy and availability budgets. Therefore, effective selection of clients participating in training is of vital importance for the convergence of the global model and the balance of client contributions. In this paper, we discuss the performance impact of client availability with time-series data on federated learning. We set up three different scenarios that affect the availability of time-series data and propose FedDeCAB, a novel, semi-decentralized client selection method applying probabilistic rankings of available clients. When a client is disconnected from the server, FedDeCAB allows it to obtain partial model parameters from its nearest neighbor clients for joint optimization, improving the performance of offline models and reducing communication overhead. Experiments based on real-world large-scale taxi and vessel trajectory datasets show that FedDeCAB is effective under highly heterogeneous data distribution, limited communication budget, and dynamic client offline or rejoining.
Problem

Research questions and friction points this paper is trying to address.

Client selection with limited energy and availability budgets
Handling time-series data heterogeneity in federated learning
Reducing communication overhead while maintaining model performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Semi-decentralized client selection with probabilistic rankings
Partial model parameter sharing from nearest neighbors
Joint optimization for offline models, reducing communication overhead
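The second innovation, partial model sharing from nearest neighbors, could look like the following sketch: a disconnected client blends a subset of its parameters with the average of its neighbors' shared layers. The parameter names, the choice of shared layers, and the blending coefficient are all assumptions for illustration:

```python
import numpy as np

def offline_update(local_params, neighbor_params_list, shared_keys, alpha=0.5):
    """When the server is unreachable, blend locally kept parameters with the
    average of the nearest neighbors' shared layers. Only keys in shared_keys
    are exchanged, so communication stays partial. Illustrative only."""
    merged = dict(local_params)
    for key in shared_keys:
        neighbor_avg = np.mean([p[key] for p in neighbor_params_list], axis=0)
        merged[key] = (1 - alpha) * local_params[key] + alpha * neighbor_avg
    return merged

# Example: only the "w" layer is shared; the local "head" stays untouched.
local = {"w": np.ones(2), "head": np.zeros(2)}
neighbors = [{"w": np.full(2, 3.0)}, {"w": np.full(2, 5.0)}]
merged = offline_update(local, neighbors, shared_keys=["w"], alpha=0.5)
```

Exchanging only a subset of layers with nearby peers, rather than the full model with the server, is what keeps the communication cost of the offline path low.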
πŸ”Ž Similar Papers
No similar papers found.
Yunkai Bao
Department of Electrical and Software Engineering, University of Calgary, Calgary, AB, Canada
Reza Safarzadeh
Department of Geomatics Engineering, University of Calgary, Calgary, AB, Canada
Xin Wang
Department of Geomatics Engineering, University of Calgary, Calgary, AB, Canada
Steve Drew
Assistant Professor at University of Calgary
Edge AI · IoT · Machine Learning