🤖 AI Summary
This work addresses the fundamental incompatibility between differential privacy and sublinear-time algorithms, tackling the challenge that efficiency and privacy are often mutually exclusive in large-scale data analysis.
Method: We construct a hard instance based on one-way marginal queries and combine information-theoretic lower-bound techniques with a privacy–efficiency trade-off analysis to rigorously establish impossibility results.
Contribution/Results: We prove the first unconditional lower bound showing that strict sublinear-time algorithms—whose query complexity is asymptotically *strictly less* than the input size—cannot satisfy differential privacy for a canonical class of one-dimensional marginal release problems. This establishes the first impossibility framework for simultaneously achieving strict sublinearity and differential privacy, filling a critical gap in the theoretical foundations at the intersection of private computation and sublinear-time algorithms.
📝 Abstract
Differential privacy and sublinear algorithms are both rapidly emerging algorithmic themes in the era of big-data analysis. Although recent works have shown that differentially private sublinear algorithms exist for many problems, including graph parameter estimation and clustering, little is known about hardness results for such algorithms. In this paper, we initiate the study of lower bounds for problems that seek both differentially private and sublinear-time algorithms. Our main result is that the two desiderata are incompatible in general: we prove that a simple problem based on one-way marginals admits both a differentially private algorithm and a sublinear-time algorithm, but does not admit a "strictly" sublinear-time algorithm that is also differentially private.
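To make the object of study concrete, here is a minimal sketch (not from the paper; the function name and interface are illustrative assumptions) of releasing all one-way marginals of a binary dataset via the standard Laplace mechanism. Note that the sketch reads every row, i.e., it runs in linear time, which is consistent with the paper's message that privacy is achievable here but not in strictly sublinear time.

```python
import numpy as np

def private_one_way_marginals(data, epsilon, rng=None):
    """Release all d one-way marginals (column means) of a binary
    n x d dataset via the Laplace mechanism.

    Changing a single row moves each marginal by at most 1/n, so the
    L1 sensitivity of the full marginal vector is d/n, and adding
    Laplace(d / (n * epsilon)) noise per coordinate gives epsilon-DP.
    Hypothetical sketch: names and interface are not from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = data.shape
    marginals = data.mean(axis=0)      # exact one-way marginals: reads all n rows
    scale = d / (n * epsilon)          # Laplace scale = L1 sensitivity / epsilon
    return marginals + rng.laplace(0.0, scale, size=d)
```

A non-private estimator could instead subsample rows and run in sublinear time; the paper's lower bound says no algorithm for this kind of problem can be both strictly sublinear-time and differentially private.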