🤖 AI Summary
To address the lack of robustness in model aggregation caused by communication noise and parameter corruption in federated learning, this paper proposes a topology-aware robust aggregation method that models the graph-structured relationships among client parameters while simultaneously recovering corrupted parameters. The key novelty is unifying graph learning and signal recovery in a single difference-of-convex (DC) optimization framework, solved via a proximal DC algorithm combined with graph signal processing techniques to achieve structured parameter reconstruction under noise. Experiments on MNIST and CIFAR-10 show that the method outperforms existing baselines by up to 2–5% in classification accuracy under non-IID (skewed) data distributions and diverse communication noise conditions. According to the authors, this is the first work to realize graph-structure-guided robust parameter recovery and aggregation in federated learning.
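For readers unfamiliar with DC programming, the generic template the summary refers to can be sketched as follows. This is only the standard proximal DC iteration; the paper's concrete objective and its convex/concave splitting are not reproduced here, so $g$, $h$, and the step size $\gamma$ are placeholders:

```latex
% Generic DC program: minimize a difference of two convex functions.
\min_{x}\; F(x) = g(x) - h(x), \qquad g,\, h \text{ convex}.

% Proximal DC iteration: linearize the concave part -h at x^k,
% then take a proximal step on the convex part g.
x^{k+1} = \operatorname{prox}_{\gamma g}\!\left( x^{k} + \gamma \nabla h(x^{k}) \right)
```

Each iteration solves a convex subproblem, which is what makes the joint graph-learning and restoration problem tractable despite the overall objective being nonconvex.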
📝 Abstract
We propose a robust aggregation method for model parameters in federated learning (FL) under noisy communications. FL is a distributed machine learning paradigm in which a central server aggregates local model parameters from multiple clients. These parameters are often noisy and/or have missing values due to data collection, training, and communication between the clients and the server, which may cause a considerable drop in model accuracy. To address this issue, we learn a graph that represents pairwise relationships between the clients' model parameters during aggregation. We formulate this as a joint problem of graph learning and signal (i.e., model parameter) restoration. The problem is cast as a difference-of-convex (DC) optimization, which is efficiently solved via a proximal DC algorithm. Experimental results on the MNIST and CIFAR-10 datasets show that the proposed method outperforms existing approaches by up to $2$--$5\%$ in classification accuracy under biased data distributions and noisy conditions.
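To make the graph-based restoration idea concrete, here is a minimal NumPy sketch. It is an illustrative simplification, not the paper's method: the graph is fixed from a Gaussian kernel on pairwise distances (the paper instead learns it jointly via the DC formulation), and restoration is a closed-form graph-Tikhonov smoothing step rather than the proximal DC algorithm. The function name and parameters are placeholders.

```python
import numpy as np

def restore_with_graph(params, sigma=1.0, lam=1.0):
    """Graph-regularized restoration of noisy client parameters (sketch).

    params: (n_clients, d) array, one noisy parameter vector per client.
    Returns a (n_clients, d) array of restored parameters, which the
    server can then average. The Gaussian-kernel graph and the closed-form
    Tikhonov step are assumptions for illustration only.
    """
    n = params.shape[0]
    # Similarity graph from pairwise squared distances (assumption:
    # fixed Gaussian kernel; the paper learns the graph jointly).
    d2 = np.sum((params[:, None, :] - params[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2.0 * sigma**2))
    np.fill_diagonal(W, 0.0)
    # Combinatorial graph Laplacian L = D - W.
    L = np.diag(W.sum(axis=1)) - W
    # Restore by graph-Tikhonov smoothing, the closed-form minimizer of
    #   ||X - params||_F^2 + lam * tr(X^T L X),
    # i.e. X = (I + lam * L)^{-1} params.
    return np.linalg.solve(np.eye(n) + lam * L, params)
```

With `lam=0` the step is a no-op (no smoothing); larger `lam` pulls each client's parameters toward those of its graph neighbors, which is the basic mechanism the joint formulation exploits.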