🤖 AI Summary
This work addresses a central obstacle in differentially private estimation: standard noise-addition mechanisms require a guaranteed bound on the estimator's sensitivity, but such bounds are often large or simply unknown, and existing black-box alternatives are either statistically inefficient or require evaluating the function on exponentially many inputs. The authors present a differentially private scheme for arbitrary black-box functions that trades off between statistical efficiency (how much data is needed) and oracle efficiency (how many function evaluations are needed), without requiring prior knowledge of sensitivity. They complement the scheme with lower bounds showing that it is near-optimal.
📝 Abstract
Standard techniques for differentially private estimation, such as Laplace or Gaussian noise addition, require guaranteed bounds on the sensitivity of the estimator in question. But such sensitivity bounds are often large or simply unknown. Thus we seek differentially private methods that can be applied to arbitrary black-box functions. A handful of such techniques exist, but all are either inefficient in their use of data or require evaluating the function on exponentially many inputs. In this work we present a scheme that trades off between statistical efficiency (i.e., how much data is needed) and oracle efficiency (i.e., the number of evaluations). We also present lower bounds showing the near-optimality of our scheme.
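To make the premise concrete, here is a minimal sketch of the standard Laplace mechanism the abstract refers to. The function name and the mean-release example are illustrative, not from the paper; the point is that the noise scale `sensitivity / epsilon` requires a sensitivity bound up front, which is exactly what is unavailable in the black-box setting.

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Release `value` with epsilon-differential privacy, given a known
    upper bound on the sensitivity of the statistic.

    The noise scale sensitivity/epsilon is the quantity that becomes
    unusable when the sensitivity bound is large or unknown.
    """
    rng = np.random.default_rng() if rng is None else rng
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: privately release the mean of values known to lie in [0, 1].
# Changing one of the n records moves the mean by at most 1/n.
data = np.array([0.2, 0.5, 0.9, 0.4])
sensitivity = 1.0 / len(data)
private_mean = laplace_mechanism(data.mean(), sensitivity, epsilon=1.0)
```

When no such bound is available (e.g., an unbounded or black-box statistic), calibrating the noise this way is impossible, which is the gap the paper's scheme addresses.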