Near-Optimal Private Linear Regression via Iterative Hessian Mixing

📅 2026-01-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of efficiently solving ordinary least squares regression under differential privacy constraints. The authors propose an algorithm that combines Gaussian random projection with an iterative Hessian sketch, paired with a privacy mechanism designed to achieve strong differential privacy guarantees while improving estimation accuracy. By overcoming the limitations of conventional Gaussian sketching approaches, the method attains better theoretical and empirical performance than state-of-the-art alternatives such as AdaSSP. Experiments on multiple standard benchmark datasets confirm that the proposed algorithm pairs near-optimal theoretical guarantees with strong practical performance.
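The Gaussian random-projection step the summary refers to can be illustrated with a small sketch. This is hypothetical code, not the paper's algorithm: it shows only the projection and sketched least-squares solve, and the projection alone does not yield a calibrated privacy guarantee.

```python
import numpy as np

def sketched_ols(X, y, m=64, seed=0):
    """Solve OLS on Gaussian-sketched data (S @ X, S @ y).

    S is an m x n matrix with i.i.d. N(0, 1/m) entries, so
    E[S.T @ S] = I and the sketched problem approximates the
    original least-squares problem.
    """
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    S = rng.normal(scale=1.0 / np.sqrt(m), size=(m, n))
    theta, *_ = np.linalg.lstsq(S @ X, S @ y, rcond=None)
    return theta

# Toy problem: y is (almost) X @ [1.0, -2.0].
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([1.0, -2.0]) + 0.01 * rng.normal(size=200)
theta = sketched_ols(X, y)
```

As the abstract notes, the paper's contribution is an iterative variant of this idea; the one-shot sketch above is only the building block.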

📝 Abstract
We study differentially private ordinary least squares (DP-OLS) with bounded data. The dominant approach, adaptive sufficient-statistics perturbation (AdaSSP), adds an adaptively chosen perturbation to the sufficient statistics, namely the matrix $X^{\top}X$ and the vector $X^{\top}Y$, and is known to achieve near-optimal accuracy and strong empirical performance. In contrast, methods based on Gaussian sketching, which ensure differential privacy by pre-multiplying the data with a random Gaussian matrix, are widely used in federated and distributed regression, yet remain relatively uncommon for DP-OLS. In this work, we introduce iterative Hessian mixing, a novel DP-OLS algorithm that relies on Gaussian sketches and is inspired by the iterative Hessian sketch algorithm. We provide a utility analysis for iterative Hessian mixing, as well as a new analysis of prior methods that rely on Gaussian sketches. We then show that our new approach circumvents the intrinsic limitations of the prior methods and provides non-trivial improvements over AdaSSP. We conclude with an extensive set of experiments across standard benchmarks, demonstrating that our approach consistently outperforms these prior baselines.
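The sufficient-statistics perturbation that AdaSSP builds on can be sketched as follows. This is an illustrative simplification, not the paper's method: the noise scale and ridge term are placeholders rather than values calibrated to an $(\epsilon, \delta)$ budget, and the adaptive regularization choice that gives AdaSSP its name is omitted.

```python
import numpy as np

def dp_ols_ssp(X, y, noise_scale=0.1, ridge=1e-3, seed=0):
    """Sufficient-statistics perturbation for OLS (illustrative).

    Adds Gaussian noise to the sufficient statistics X.T @ X and
    X.T @ y, then solves the ridge-regularized normal equations.
    noise_scale is a placeholder, not calibrated to a formal
    (epsilon, delta) differential-privacy guarantee.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]

    # Symmetrize the matrix noise so the perturbed Gram matrix
    # stays symmetric.
    E = rng.normal(scale=noise_scale, size=(d, d))
    noisy_xtx = X.T @ X + (E + E.T) / 2
    noisy_xty = X.T @ y + rng.normal(scale=noise_scale, size=d)

    # The ridge term keeps the perturbed system well-conditioned.
    return np.linalg.solve(noisy_xtx + ridge * np.eye(d), noisy_xty)

# Toy problem: y is (almost) X @ [1.0, -2.0].
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([1.0, -2.0]) + 0.01 * rng.normal(size=200)
theta = dp_ols_ssp(X, y)
```

This is the baseline family the abstract contrasts with: it privatizes the normal equations directly, whereas sketch-based methods privatize the data through the random projection itself.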
Problem

Research questions and friction points this paper is trying to address.

differentially private linear regression
ordinary least squares
Gaussian sketching
sufficient statistics perturbation
privacy-accuracy tradeoff
Innovation

Methods, ideas, or system contributions that make the work stand out.

differentially private regression
Gaussian sketching
iterative Hessian mixing
ordinary least squares
privacy-preserving machine learning