DFed-SST: Building Semantic- and Structure-aware Topologies for Decentralized Federated Graph Learning

📅 2025-08-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of jointly modeling topological structure and semantic information in decentralized federated graph learning, this paper proposes a dual-topology adaptive communication mechanism. Under a decentralized architecture, it constructs two complementary communication topologies (semantic-aware and structure-aware) and dynamically optimizes neighbor selection and model aggregation based on the structural heterogeneity of local subgraphs. This is the first approach to achieve joint structural-semantic modeling of graph data without a central server, integrating graph neural networks, distributed optimization, and topology-adaptive techniques. Extensive experiments on eight real-world graph datasets show that the method achieves an average accuracy improvement of 3.26% over state-of-the-art baselines, significantly enhancing generalization capability and communication efficiency, particularly under heterogeneous subgraph settings.

📝 Abstract
Decentralized Federated Learning (DFL) has emerged as a robust distributed paradigm that circumvents the single-point-of-failure and communication-bottleneck risks of centralized architectures. However, a significant challenge arises because existing DFL optimization strategies, primarily designed for tasks such as computer vision, fail to address the unique topological information inherent in local subgraphs. Notably, while Federated Graph Learning (FGL) is tailored for graph data, it is predominantly implemented in a centralized server-client model, failing to leverage the benefits of decentralization. To bridge this gap, we propose DFed-SST, a decentralized federated graph learning framework with adaptive communication. The core of our method is a dual-topology adaptive communication mechanism that leverages the unique topological features of each client's local subgraph to dynamically construct and optimize the inter-client communication topology. This allows our framework to guide model aggregation efficiently in the face of heterogeneity. Extensive experiments on eight real-world datasets consistently demonstrate the superiority of DFed-SST, which achieves a 3.26% improvement in average accuracy over baseline methods.
Problem

Research questions and friction points this paper is trying to address.

Addresses decentralized federated learning for graph data
Overcomes limitations of centralized graph learning models
Optimizes inter-client communication using local subgraph topologies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dual-topology adaptive communication mechanism
Dynamic inter-client topology construction
Semantic- and structure-aware model aggregation
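The dual-topology idea above can be sketched in a few lines. The sketch below is an illustrative assumption, not the paper's actual algorithm: it supposes each client exposes a semantic vector (e.g. a model or label-distribution embedding) and a structural vector (e.g. subgraph degree statistics), then blends cosine similarities on both views into normalized aggregation weights over the other clients. The function names, the `alpha` blending parameter, and the choice of cosine similarity are all hypothetical.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def aggregation_weights(client_id, sem_embeds, struct_feats, alpha=0.5):
    """Blend semantic and structural similarity into normalized
    aggregation weights over the other clients.

    sem_embeds   -- array of per-client semantic vectors (hypothetical)
    struct_feats -- array of per-client structural vectors (hypothetical)
    alpha        -- weight on the semantic view vs. the structural view
    """
    me_sem, me_struct = sem_embeds[client_id], struct_feats[client_id]
    scores = {}
    for j, (s, t) in enumerate(zip(sem_embeds, struct_feats)):
        if j == client_id:
            continue  # a client does not aggregate with itself
        scores[j] = alpha * cosine(me_sem, s) + (1 - alpha) * cosine(me_struct, t)
    total = sum(scores.values())
    return {j: v / total for j, v in scores.items()}
```

In a decentralized round, each client could use these weights to form a weighted average of its neighbors' model updates, so that clients with similar semantics and similar subgraph structure contribute more; the actual neighbor selection and optimization in DFed-SST are more involved.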
Lianshuai Guo
Shandong University, School of Mechanical, Electrical and Information Engineering, Weihai, 264209, China
Zhongzheng Yuan
Shandong University, School of Mechanical, Electrical and Information Engineering, Weihai, 264209, China
Xunkai Li
School of Computer Science and Technology, Beijing Institute of Technology
Data-centric AI · Graph ML · AI4Science
Yinlin Zhu
Sun Yat-sen University
Graph Neural Networks · Federated Learning
Meixia Qu
Shandong University, School of Mechanical, Electrical and Information Engineering, Weihai, 264209, China
Wenyu Wang
Shandong University, School of Mechanical, Electrical and Information Engineering, Weihai, 264209, China