GRANITE : a Byzantine-Resilient Dynamic Gossip Learning Framework

📅 2025-04-24
🤖 AI Summary
This paper addresses Byzantine model-poisoning attacks in dynamic gossip learning, where malicious nodes manipulate the Random Peer Sampling (RPS) protocol to scale up poisoning. The authors propose GRANITE, which combines History-aware Byzantine-resilient Peer Sampling (HaPS) and an Adaptive Probabilistic Threshold (APT). HaPS labels nodes based on their interaction history to harden RPS on dynamic sparse graphs, while APT uses an estimate of Byzantine presence to set aggregation thresholds adaptively, jointly providing formal robustness guarantees for distributed model aggregation. Theoretically, the method ensures convergence with up to 30% Byzantine nodes and supports communication graphs up to nine times sparser than state-of-the-art approaches require. Experiments demonstrate significant improvements in both adversarial robustness and communication efficiency, achieving a favorable trade-off between resilience and scalability.

📝 Abstract
Gossip Learning (GL) is a decentralized learning paradigm where users iteratively exchange and aggregate models with a small set of neighboring peers. Recent GL approaches rely on dynamic communication graphs built and maintained using Random Peer Sampling (RPS) protocols. Thanks to graph dynamics, GL can achieve fast convergence even over extremely sparse topologies. However, the robustness of GL over dynamic graphs to Byzantine (model poisoning) attacks remains unaddressed, especially when Byzantine nodes attack the RPS protocol to scale up model poisoning. We address this issue by introducing GRANITE, a framework for robust learning over sparse, dynamic graphs in the presence of a fraction of Byzantine nodes. GRANITE relies on two key components: (i) a History-aware Byzantine-resilient Peer Sampling protocol (HaPS), which tracks previously encountered identifiers to reduce adversarial influence over time, and (ii) an Adaptive Probabilistic Threshold (APT), which leverages an estimate of Byzantine presence to set aggregation thresholds with formal guarantees. Empirical results confirm that GRANITE maintains convergence with up to 30% Byzantine nodes, improves learning speed via adaptive filtering of poisoned models, and obtains these results on graphs up to 9 times sparser than dictated by current theory.
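To make the history-aware sampling idea concrete, here is a minimal sketch of a peer sampler that tracks previously encountered identifiers and weights long-known peers more heavily. The class name, API, and the linear age-based weighting are illustrative assumptions, not the paper's actual HaPS protocol:

```python
import random

class HistoryAwarePeerSampler:
    """Illustrative sketch (hypothetical, not the paper's HaPS): peers that
    have been known for longer receive more sampling weight, which limits
    the influence of freshly injected Byzantine identifiers."""

    def __init__(self, view_size=5):
        self.view_size = view_size
        self.first_seen = {}  # peer id -> logical time of first encounter
        self.clock = 0

    def observe(self, peer_ids):
        # Record when each identifier was first encountered.
        self.clock += 1
        for pid in peer_ids:
            self.first_seen.setdefault(pid, self.clock)

    def sample(self):
        # Sample a view, weighting peers by how long they have been known
        # (older identifiers get proportionally more weight).
        peers = list(self.first_seen)
        if not peers:
            return []
        weights = [self.clock - self.first_seen[p] + 1 for p in peers]
        k = min(self.view_size, len(peers))
        return random.choices(peers, weights=weights, k=k)
```

The key design point the sketch tries to capture is that adversarial identifiers injected late into the RPS view start with minimal weight, so their influence over the sampled topology decays relative to long-lived honest peers.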
Problem

Research questions and friction points this paper is trying to address.

Ensuring Byzantine resilience in dynamic Gossip Learning
Mitigating model poisoning attacks in sparse communication graphs
Improving learning speed with adaptive Byzantine node filtering
Innovation

Methods, ideas, or system contributions that make the work stand out.

History-aware Byzantine-resilient Peer Sampling protocol
Adaptive Probabilistic Threshold for aggregation
Robust learning over sparse dynamic graphs
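As a rough illustration of threshold-based robust aggregation in the spirit of APT, the sketch below keeps only the models closest to the coordinate-wise median, with the kept fraction shrinking as the estimated Byzantine presence grows. The function name and the specific distance-and-trim rule are assumptions for illustration, not the paper's exact APT mechanism:

```python
import statistics

def adaptive_threshold_aggregate(models, byz_estimate):
    """Illustrative robust aggregation sketch (not the paper's exact APT
    rule): discard the models farthest from the coordinate-wise median,
    keeping a fraction that shrinks with the estimated Byzantine share
    `byz_estimate`, then average the survivors."""
    n = len(models)
    dim = len(models[0])
    # Coordinate-wise median as a robust reference point.
    median = [statistics.median(m[i] for m in models) for i in range(dim)]

    def dist(m):
        # Euclidean distance to the median reference.
        return sum((a - b) ** 2 for a, b in zip(m, median)) ** 0.5

    # Keep roughly the (1 - byz_estimate) fraction closest to the median.
    keep = max(1, round(n * (1 - byz_estimate)))
    survivors = sorted(models, key=dist)[:keep]
    return [sum(m[i] for m in survivors) / len(survivors) for i in range(dim)]
```

With an estimated 25% Byzantine share, aggregating `[[1.0, 1.0], [1.1, 0.9], [0.9, 1.1], [10.0, 10.0]]` drops the outlier and averages the three benign models to approximately `[1.0, 1.0]`.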