🤖 AI Summary
Federated learning is vulnerable to poisoning attacks, and existing defenses rely on external datasets or assume a fixed proportion of malicious clients, limiting generalizability and scalability. To address this, we propose the first server-side defense framework that requires no external data and makes no assumptions about the fraction of malicious clients. Our approach leverages a conditional generative adversarial network (cGAN) to dynamically generate task-relevant, discriminative synthetic data; integrates update consistency checking; and incorporates a differential privacy enhancement module to verify the authenticity of client model updates in real time prior to aggregation. The framework is fully adaptive and integrates end to end into existing FL pipelines. Evaluated on CIFAR-10 and MNIST, it achieves >96% true positive rate and >94% true negative rate, with <1.2% accuracy degradation, outperforming state-of-the-art methods in robustness and practicality.
📝 Abstract
Federated Learning (FL) enables collaborative model training across decentralized devices without sharing raw data, but it remains vulnerable to poisoning attacks that compromise model integrity. Existing defenses often rely on external datasets or predefined heuristics (e.g., a known number of malicious clients), limiting their effectiveness and scalability. To address these limitations, we propose a privacy-preserving defense framework that leverages a Conditional Generative Adversarial Network (cGAN) to generate synthetic data at the server for authenticating client updates, eliminating the need for external datasets. Our framework is scalable, adaptive, and integrates seamlessly into FL workflows. Extensive experiments on benchmark datasets demonstrate its robust performance against a variety of poisoning attacks, achieving a high True Positive Rate (TPR) on malicious clients and a high True Negative Rate (TNR) on benign clients while maintaining model accuracy. The proposed framework offers a practical and effective solution for securing federated learning systems.
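The core server-side step described above can be illustrated with a minimal sketch: before aggregation, each client's update is scored on a server-held probe set, and updates that perform poorly are excluded. Everything here is an assumption for illustration, not the paper's implementation: a linear classifier stands in for the FL model, random Gaussian blobs stand in for the cGAN-generated synthetic data, and the 0.7 acceptance threshold and a simple label-flipping attacker are hypothetical choices.

```python
# Hypothetical sketch of server-side update authentication in FL:
# score each client update on synthetic probe data, drop low scorers,
# then aggregate only the accepted updates. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for cGAN-generated, label-conditioned probe data (2 classes).
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

def probe_accuracy(w, b):
    """Accuracy of a linear model 1[Xw + b > 0] on the probe set."""
    preds = (X @ w + b > 0).astype(int)
    return (preds == y).mean()

# Benign clients send updates near the true separator; one poisoned
# client (a label-flipping attacker) sends an inverted update.
benign = [(np.array([1.0, 1.0]) + rng.normal(0, 0.1, 2), 0.0)
          for _ in range(4)]
poisoned = [(np.array([-1.0, -1.0]), 0.0)]
updates = benign + poisoned

# Flag updates whose probe accuracy falls below a (hypothetical)
# threshold; average the rest, as in plain FedAvg.
scores = [probe_accuracy(w, b) for w, b in updates]
accepted = [i for i, s in enumerate(scores) if s > 0.7]
w_agg = np.mean([updates[i][0] for i in accepted], axis=0)

print("accepted clients:", accepted)  # the four benign clients
```

In this toy setup the benign updates score near 100% on the probe set while the flipped update scores near 0%, so only the benign clients enter the average; the paper's framework additionally adapts the synthetic data to the task and applies consistency and differential-privacy checks rather than a fixed accuracy cutoff.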