🤖 AI Summary
This work addresses the tightening of privacy bounds for Gaussian sketching under Rényi Differential Privacy (RDP). We establish, for the first time, an exact theoretical connection between Gaussian random projection and RDP, proposing a tighter privacy budget analysis framework. Methodologically, we employ a refined moment-generating function analysis to significantly tighten existing RDP bounds and derive verifiable utility lower bounds. Theoretically, we prove that this mechanism simultaneously achieves strict RDP guarantees and statistically optimal utility across multiple linear regression settings, including ordinary least squares and ridge regression. Empirically, on several benchmark datasets, our approach improves model accuracy by 5–12%, reduces runtime by over 20% in certain scenarios, and reduces the privacy budget by 30–50% compared to prior work.
📝 Abstract
Gaussian sketching, which consists of pre-multiplying the data with a random Gaussian matrix, is a widely used technique for multiple problems in data science and machine learning, with applications spanning computationally efficient optimization, coded computing, and federated learning. This operation also provides differential privacy guarantees due to its inherent randomness. In this work, we revisit this operation through the lens of Rényi Differential Privacy (RDP), providing a refined privacy analysis that yields significantly tighter bounds than prior results. We then demonstrate how this improved analysis leads to performance improvements in different linear regression settings, establishing theoretical utility guarantees. Empirically, our methods improve performance across multiple datasets and, in several cases, reduce runtime.
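To make the sketching operation concrete, the following is a minimal NumPy sketch (not the paper's implementation) of Gaussian sketching applied to least squares; the sketch size `k` and the `1/sqrt(k)` scaling are standard conventions assumed here, not taken from this work:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: n samples, d features, with n >> d.
n, d, k = 1000, 10, 200
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Gaussian sketch: pre-multiply by a k x n matrix with i.i.d. N(0, 1/k)
# entries. The randomness of G is what yields the privacy guarantee.
G = rng.standard_normal((k, n)) / np.sqrt(k)
A_sk, b_sk = G @ A, G @ b

# Solve the much smaller sketched least-squares problem and compare
# against the solution of the full problem.
x_sk, *_ = np.linalg.lstsq(A_sk, b_sk, rcond=None)
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x_sk - x_full))  # typically small for k >> d
```

The sketched solve works on a `k x d` system instead of `n x d`, which is the source of the runtime savings the abstract mentions; the privacy analysis in the paper quantifies how much protection the random projection itself provides.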