Graph Structure Learning with Privacy Guarantees for Open Graph Data

📅 2025-07-25
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Balancing privacy compliance (e.g., GDPR) and data utility in open graph data publishing remains challenging—especially when data providers and consumers are decoupled. Existing privacy-preserving graph data publishing (PPDP) methods often lack rigorous, end-to-end privacy guarantees at the release stage. Method: This paper introduces Gaussian Differential Privacy (GDP) into the graph publishing phase for the first time, proposing a structured noise injection framework. Grounded in theoretical guarantees, it enables unbiased recovery of graph structure and supports discrete graph-valued variables. By integrating graph structure estimation theory with joint privacy–utility optimization, the framework achieves provable privacy protection without compromising analytical fidelity. Contribution/Results: The method significantly outperforms baselines on downstream tasks—including node classification and link prediction—while maintaining high model utility under stringent privacy budgets. It establishes a formally verifiable, practically deployable paradigm for secure, open sharing of graph data.

Technology Category

Application Category

📝 Abstract
Ensuring privacy in large-scale open datasets is increasingly challenging under regulations such as the General Data Protection Regulation (GDPR). While differential privacy (DP) provides strong theoretical guarantees, it primarily focuses on noise injection during model training, neglecting privacy preservation at the data publishing stage. Existing privacy-preserving data publishing (PPDP) approaches struggle to balance privacy and utility, particularly when data publishers and users are distinct entities. To address this gap, we focus on the graph recovery problem and propose a novel privacy-preserving estimation framework for open graph data, leveraging Gaussian DP (GDP) with a structured noise-injection mechanism. Unlike traditional methods that perturb gradients or model updates, our approach ensures unbiased graph structure recovery while enforcing DP at the data publishing stage. Moreover, we provide theoretical guarantees on estimation accuracy and extend our method to discrete-variable graphs, a setting often overlooked in DP research. Experimental results in graph learning demonstrate robust performance, offering a viable solution for privacy-conscious graph analysis.
Problem

Research questions and friction points this paper is trying to address.

Ensuring privacy in open graph data under GDPR regulations
Balancing privacy and utility in graph data publishing
Recovering graph structures with differential privacy guarantees
Innovation

Methods, ideas, or system contributions that make the work stand out.

GDPR-compliant graph structure learning
Gaussian DP with structured noise
Unbiased recovery for discrete graphs
🔎 Similar Papers
2024-02-29International Conference on Learning RepresentationsCitations: 0
M
Muhao Guo
Arizona State University, Tempe AZ 85281, USA
J
Jiaqi Wu
Arizona State University, Tempe AZ 85281, USA
Yang Weng
Yang Weng
Associate Professor, School of Electrical, Computer, and Energy Eng., Arizona State University
Machine Learning for Power Systems
Yizheng Liao
Yizheng Liao
Stanford University
Statistical LearningStructural Health MonitoringSmart GridData Analytics
S
Shengzhe Chen
Arizona State University, Tempe AZ 85281, USA