Continual Release of Densest Subgraphs: Privacy Amplification & Sublinear Space via Subsampling

📅 2025-10-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper studies the densest subgraph (DSG) problem under edge differential privacy in the insertion-only data stream model with continual release. To address the limitations of existing static differentially private (DP) algorithms—namely, large additive error—and non-private streaming algorithms—namely, high space overhead—we propose a unified framework integrating subsampling-based privacy amplification and sparsification, formally characterizing the privacy amplification gain from subsampling in graph DP for the first time. We further introduce a graph densification mechanism to eliminate redundant logarithmic factors. Leveraging black-box reductions, our approach seamlessly integrates streaming computation with static DP mechanisms, supporting both pure and approximate DP. Theoretically, our algorithm achieves optimal $O(\log n)$ additive error—matching the known lower bound for static DP—and $O(n \log n)$ space complexity—matching the space efficiency of non-private streaming algorithms—thus attaining state-of-the-art accuracy and space efficiency simultaneously.

📝 Abstract
We study the sublinear space continual release model for edge-differentially private (DP) graph algorithms, with a focus on the densest subgraph problem (DSG) in the insertion-only setting. Our main result is the first continual release DSG algorithm that matches the additive error of the best static DP algorithms and the space complexity of the best non-private streaming algorithms, up to constants. The key idea is a refined use of subsampling that simultaneously achieves privacy amplification and sparsification, a connection not previously formalized in graph DP. Via a simple black-box reduction to the static setting, we obtain both pure and approximate-DP algorithms with $O(\log n)$ additive error and $O(n \log n)$ space, improving both accuracy and space complexity over the previous state of the art. Along the way, we introduce graph densification in the graph DP setting, adding edges to trigger earlier subsampling, which removes the extra logarithmic factors in error and space incurred by prior work [ELMZ25]. We believe this simple idea may be of independent interest.
Problem

Research questions and friction points this paper is trying to address.

Achieving edge-differential privacy in continual release densest subgraph algorithms
Matching additive error of static DP algorithms with sublinear space
Using subsampling for simultaneous privacy amplification and sparsification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Subsampling achieves privacy amplification and sparsification
Graph densification removes logarithmic error factors
Black-box reduction enables sublinear space complexity
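The paper's exact mechanism is not reproduced here, but the dual role of subsampling described above can be illustrated with a minimal Python sketch: keeping each streamed edge independently with probability $p$ both sparsifies the graph (expected size $pm$) and, by the standard amplification-by-subsampling bound, turns an $\varepsilon$-DP mechanism run on the sample into a $\ln(1 + p(e^{\varepsilon} - 1))$-DP mechanism overall. The function names and parameters below are illustrative assumptions, not the paper's API.

```python
import math
import random

def subsample_stream(edges, p, seed=0):
    """Bernoulli-subsample an insertion-only edge stream:
    keep each arriving edge independently with probability p.
    Illustrative sketch of sparsification via subsampling."""
    rng = random.Random(seed)
    return [e for e in edges if rng.random() < p]

def amplified_epsilon(eps, p):
    """Standard privacy amplification by subsampling:
    an eps-DP mechanism applied to a Bernoulli(p) subsample
    satisfies ln(1 + p * (e^eps - 1))-DP on the full input."""
    return math.log(1.0 + p * (math.exp(eps) - 1.0))

# Toy insertion-only stream of 1000 edges (synthetic example).
edges = [(i, (7 * i + 1) % 100) for i in range(1000)]
sparse = subsample_stream(edges, p=0.1)
print(len(sparse))                   # about 100 edges in expectation
print(amplified_epsilon(1.0, 0.1))   # amplified budget, well below 1.0
```

Note how a smaller sampling rate $p$ simultaneously shrinks the stored subgraph and tightens the effective privacy budget; the paper's densification idea exploits this by adding edges so that subsampling is triggered earlier.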