🤖 AI Summary
This paper addresses the "heterogeneity amplification" problem in asynchronous federated learning (AFL), wherein uneven client participation (especially frequent updates from fast clients under non-IID data) exacerbates global model bias and slows convergence. To mitigate this, the authors propose a full-client participation mechanism and design two buffer-free, immediate-update algorithms: ACE (All-Client Engagement AFL) and its delay-aware variant, ACED. Both employ dynamic participation control, latency-adaptive weight adjustment, and full-client gradient aggregation, resolving the tension between update staleness and client heterogeneity without introducing auxiliary storage overhead. Theoretical analysis establishes convergence guarantees under realistic asynchrony and heterogeneity assumptions. Extensive experiments across multiple tasks demonstrate that ACE and ACED significantly improve convergence speed and model robustness, particularly in high-heterogeneity and high-latency regimes, outperforming state-of-the-art AFL baselines.
📄 Abstract
In Asynchronous Federated Learning (AFL), the central server immediately updates the global model with each arriving client's contribution. As a result, clients perform their local training on different model versions, causing information staleness (delay). In federated environments with non-IID local data distributions, this asynchronous pattern amplifies the adverse effect of client heterogeneity (arising from different data distributions, local objectives, etc.), because faster clients contribute more frequent updates and thereby bias the global model. We term this phenomenon heterogeneity amplification. Our work provides a theoretical analysis that maps AFL design choices to their resulting error sources when heterogeneity amplification occurs. Guided by our analysis, we propose ACE (All-Client Engagement AFL), which mitigates participation imbalance through immediate, non-buffered updates that use the latest information available from all clients. We also introduce a delay-aware variant, ACED, to balance client diversity against update staleness. Experiments with multiple models and tasks, across diverse heterogeneity and delay settings, validate our analysis and demonstrate the robust performance of our approaches.
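To make the described mechanism concrete, below is a minimal, hypothetical sketch of an ACE-style server loop: the server caches the most recent update from every client and, on each arrival, immediately refreshes the global model using the latest information available from all clients, with no buffering. The class name, the staleness bookkeeping, and the exact delay-aware weighting used for the ACED variant are illustrative assumptions, not the paper's actual implementation.

```python
import math


class ACEServer:
    """Sketch of buffer-free, all-client aggregation (assumed structure)."""

    def __init__(self, model_dim, num_clients, lr=0.1):
        self.global_model = [0.0] * model_dim
        # Latest cached update per client; stale entries are reused, not discarded.
        self.latest_updates = [[0.0] * model_dim for _ in range(num_clients)]
        # Arrivals since each client last reported (a simple staleness proxy).
        self.staleness = [0] * num_clients
        self.lr = lr

    def on_client_arrival(self, client_id, update, delay_aware=False):
        # Immediately replace the arriving client's cached update (no buffer).
        self.latest_updates[client_id] = list(update)
        self.staleness = [s + 1 for s in self.staleness]
        self.staleness[client_id] = 0

        if delay_aware:
            # ACED-style variant (assumed form): down-weight stale cached updates.
            raw = [1.0 / (1.0 + s) for s in self.staleness]
            total = sum(raw)
            weights = [r / total for r in raw]
        else:
            # ACE: uniform aggregation over all clients' latest updates.
            n = len(self.latest_updates)
            weights = [1.0 / n] * n

        # Aggregate across ALL clients, then apply one immediate global step.
        aggregated = [
            sum(w * u[d] for w, u in zip(weights, self.latest_updates))
            for d in range(len(self.global_model))
        ]
        self.global_model = [
            g - self.lr * a for g, a in zip(self.global_model, aggregated)
        ]
        return self.global_model
```

Because every client's latest update participates in every aggregation, a fast client's frequent arrivals cannot crowd out slow clients' contributions, which is the participation-imbalance effect the abstract targets; the `delay_aware` branch additionally trades that diversity off against staleness.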