Subgraph Federated Learning via Spectral Methods

📅 2025-10-29
🤖 AI Summary
Existing graph federated learning (GFL) methods for cross-client subgraph interconnection either suffer from privacy risks due to node embedding leakage or face scalability bottlenecks from computationally expensive operations. Method: We propose FedLap, the first GFL framework offering strong privacy guarantees without transmitting sensitive node embeddings. FedLap leverages spectral-domain Laplacian smoothing to fuse global topological structure and model cross-subgraph node dependencies in the frequency domain, eliminating explicit embedding exchange and dense graph convolutions. It supports fully decentralized training with low communication overhead and provides formal privacy guarantees. Results: Extensive experiments on multiple benchmark datasets demonstrate that FedLap achieves performance comparable to or better than state-of-the-art methods under strict privacy constraints, while significantly reducing communication and computational costs.
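The core operation the summary describes, Laplacian smoothing of node features, can be sketched as a generic low-pass graph filter. This is a minimal illustration of the technique, not FedLap's actual filter or federation protocol (neither is specified here); the graph, features, and smoothing strength `lam` are all hypothetical.

```python
import numpy as np

def laplacian_smoothing(A, X, lam=1.0):
    """Smooth node features X by solving (I + lam * L) X' = X,
    where L = D - A is the combinatorial graph Laplacian.
    In the spectral domain this scales each graph-frequency
    component by 1 / (1 + lam * mu), attenuating high frequencies."""
    D = np.diag(A.sum(axis=1))
    L = D - A
    return np.linalg.solve(np.eye(A.shape[0]) + lam * L, X)

# Toy 4-node path graph with scalar node features (hypothetical example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.array([[1.0], [0.0], [0.0], [1.0]])
Xs = laplacian_smoothing(A, X, lam=0.5)
```

Because the constant vector is an eigenvector of `I + lam * L` with eigenvalue 1, the feature sum is preserved while differences across edges shrink, which is the "smoothing" effect.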

📝 Abstract
We consider the problem of federated learning (FL) with graph-structured data distributed across multiple clients. In particular, we address the prevalent scenario of interconnected subgraphs, where interconnections between clients significantly influence the learning process. Existing approaches suffer from critical limitations, either requiring the exchange of sensitive node embeddings, thereby posing privacy risks, or relying on computationally intensive steps, which hinders scalability. To tackle these challenges, we propose FedLap, a novel framework that leverages global structure information via Laplacian smoothing in the spectral domain to effectively capture inter-node dependencies while ensuring privacy and scalability. We provide a formal privacy analysis of FedLap, demonstrating that it preserves privacy. Notably, FedLap is the first subgraph FL scheme with strong privacy guarantees. Extensive experiments on benchmark datasets demonstrate that FedLap achieves competitive or superior utility compared to existing techniques.
Problem

Research questions and friction points this paper is trying to address.

Federated learning with distributed graph-structured data
Addressing interconnected subgraphs across multiple clients
Overcoming privacy risks and scalability limitations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Laplacian smoothing in spectral domain
Preserves privacy without sharing embeddings
First subgraph FL with strong privacy guarantees
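The "preserves privacy without sharing embeddings" point above can be contrasted with what clients do share. A hypothetical sketch of parameter-only communication, in the style of generic federated averaging (FedAvg), follows; it is not FedLap's actual protocol, and the linear model, data, and round counts are illustrative assumptions only.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, steps=10):
    """One client's local training on its own subgraph features:
    gradient descent on a linear model with squared loss."""
    w = weights.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_round(global_w, client_data):
    """Each client trains locally; only weight vectors leave the client,
    never raw features or node embeddings."""
    updates = [local_update(global_w, X, y) for X, y in client_data]
    return np.mean(updates, axis=0)

# Three hypothetical clients drawing from the same linear ground truth.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    y = X @ true_w + 0.01 * rng.normal(size=20)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(50):
    w = federated_round(w, clients)
```

The privacy-relevant design choice is what crosses the network: here only the 2-dimensional weight vector, whereas embedding-exchange schemes transmit per-node representations that can leak information about individual nodes.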