Panther: A Cost-Effective Privacy-Preserving Framework for GNN Training and Inference Services in Cloud Environments

📅 2025-11-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address data privacy risks and high computational costs in cloud-based Graph Neural Network (GNN) training and inference, this paper proposes Panther, the first privacy-preserving computing framework to integrate a four-party asynchronous secure array access protocol with a node-neighborhood random padding mechanism. The framework ensures end-to-end security while significantly reducing computational and communication overhead. Compared to state-of-the-art methods, experiments show average reductions of 75.28% in training time and 82.80% in inference time, along with decreases of 52.61% and 50.26% in communication volume, respectively; estimated cost savings on Google Cloud reach 55.05% and 59.00%. The core contribution is the first joint design of asynchronous multi-party secure computation and lightweight privacy protection tailored to structured graph data, balancing security guarantees, computational efficiency, and practical deployability.
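The summary above refers to four-party secure computation. The paper's actual protocol is not reproduced here, but the underlying idea of multi-party secret sharing can be illustrated with a minimal sketch: a value is split into additive shares over a prime field, no single party's share reveals anything, and linear operations can be performed share-wise without reconstruction. All names and the field size below are illustrative choices, not from the paper.

```python
import secrets

P = 2**61 - 1  # prime modulus, chosen here purely for illustration

def share(value, n_parties=4):
    """Split `value` into n additive shares that sum to it mod P.
    Any subset of fewer than n shares is uniformly random and
    reveals nothing about `value`."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares mod P."""
    return sum(shares) % P

# Each party can add its shares of two secrets locally, so a
# shared sum is computed with no communication at all:
a, b = share(100), share(23)
assert reconstruct([x + y for x, y in zip(a, b)]) == 123
```

Non-linear steps (comparisons, array indexing as in the paper's secure array access protocol) require interaction between the parties; the asynchronous execution of those interactive steps is where Panther's savings come from.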

📝 Abstract
Graph Neural Networks (GNNs) have made a significant impact in traffic state prediction, social recommendation, knowledge-aware question answering, and so on. As more and more users move towards cloud computing, it has become a critical issue to unleash the power of GNNs while protecting privacy in cloud environments. Specifically, the training data and inference data for GNNs need to be protected from being stolen by external adversaries. Meanwhile, the financial cost of cloud computing is another primary concern for users. Therefore, although existing studies have proposed privacy-preserving techniques for GNNs in cloud environments, their additional computational and communication overhead remains relatively high, causing high financial costs that limit their widespread adoption among users. To protect GNN privacy while lowering the additional financial costs, we introduce Panther, a cost-effective privacy-preserving framework for GNN training and inference services in cloud environments. Technically, Panther leverages four-party computation to asynchronously execute the secure array access protocol, and randomly pads the neighbor information of GNN nodes. We prove that Panther can protect privacy for both training and inference of GNN models. Our evaluation shows that Panther reduces the training and inference time by an average of 75.28% and 82.80%, respectively, and communication overhead by an average of 52.61% and 50.26% compared with the state-of-the-art, which is estimated to save an average of 55.05% and 59.00% in financial costs (based on an on-demand pricing model) for the GNN training and inference process on Google Cloud Platform.
Problem

Research questions and friction points this paper is trying to address.

Protecting GNN training and inference data from external adversaries
Reducing high computational and communication costs in cloud environments
Providing cost-effective privacy preservation for GNN cloud services
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses four-party computation for secure array access
Asynchronously executes privacy-preserving protocols
Randomly pads neighbor information in GNN nodes
Congcong Chen
School of Computer Science and Technology, Tongji University, Shanghai 201804, China
Xinyu Liu
School of Computer Science and Technology, Tongji University, Shanghai 201804, China
Kaifeng Huang
Tongji University
OSS Supply Chain, Software Engineering
Lifei Wei
College of Information Engineering, Shanghai Maritime University, Shanghai 201306, China
Yang Shi
School of Computer Science and Technology, Tongji University, Shanghai 201804, China