Adaptive Batch Size for Privately Finding Second-Order Stationary Points

📅 2024-10-10
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
The computational cost of finding second-order stationary points (SOSPs) under differential privacy has been unclear, and the prior analysis of α-SOSPs by Ganesh et al. (2023) contains a flaw in its saddle-point escape procedure, yielding weaker guarantees than claimed. Method: building on the SpiderBoost framework, the authors combine adaptive batch sizes with the binary-tree mechanism to privately estimate gradients and Hessian-vector products. Contribution/Results: the algorithm finds an α-SOSP with α = O(1/n^{1/3} + (√d/(nε))^{1/2}) at total gradient complexity O(n^{5/3}), matching the state-of-the-art accuracy for privately finding a first-order stationary point (FOSP) and suggesting that SOSPs can be computed at no additional privacy cost over FOSPs.
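The binary-tree (tree-aggregation) mechanism named in the summary is a standard differential-privacy building block for releasing running sums: each element contributes to only O(log T) tree nodes, so per-node noise can stay small. A minimal sketch under assumed sensitivity-1 values and pure ε-DP; all function and parameter names here are illustrative, not from the paper:

```python
import math
import random

def tree_prefix_sums(values, epsilon):
    """Release all prefix sums of `values` with the tree-aggregation
    mechanism: each value lies in at most ceil(log2(T)) + 1 dyadic tree
    nodes, so adding Laplace noise of scale (levels / epsilon) to each
    node's partial sum gives epsilon-DP overall (sensitivity-1 values
    assumed)."""
    T = len(values)
    levels = math.ceil(math.log2(T)) + 1 if T > 1 else 1
    scale = levels / epsilon  # Laplace noise scale per tree node

    noisy = {}  # cache: one noisy partial sum per dyadic node (lo, hi)

    def node_sum(lo, hi):
        key = (lo, hi)
        if key not in noisy:
            true = sum(values[lo:hi])
            # Laplace(scale) noise as a difference of two exponentials
            noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
            noisy[key] = true + noise
        return noisy[key]

    def dyadic_decompose(t):
        # split [0, t) into O(log T) aligned dyadic intervals
        out, lo = [], 0
        while lo < t:
            size = 1
            while lo % (2 * size) == 0 and lo + 2 * size <= t:
                size *= 2
            out.append((lo, lo + size))
            lo += size
        return out

    return [sum(node_sum(lo, hi) for lo, hi in dyadic_decompose(t))
            for t in range(1, T + 1)]
```

The point of the tree structure is that every prefix sum is answered from O(log T) cached noisy nodes, so the error per prefix grows polylogarithmically in T rather than linearly.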

📝 Abstract
There is a gap between finding a first-order stationary point (FOSP) and a second-order stationary point (SOSP) under differential privacy constraints, and it remains unclear whether privately finding an SOSP is more challenging than finding an FOSP. Specifically, Ganesh et al. (2023) claimed that an $\alpha$-SOSP can be found with $\alpha=O(\frac{1}{n^{1/3}}+(\frac{\sqrt{d}}{n\epsilon})^{3/7})$, where $n$ is the dataset size, $d$ is the dimension, and $\epsilon$ is the differential privacy parameter. However, a recent analysis revealed an issue in their saddle point escape procedure, leading to weaker guarantees. Building on the SpiderBoost algorithm framework, we propose a new approach that uses adaptive batch sizes and incorporates the binary tree mechanism. Our method not only corrects this issue but also improves the results for privately finding an SOSP, achieving $\alpha=O(\frac{1}{n^{1/3}}+(\frac{\sqrt{d}}{n\epsilon})^{1/2})$. This improved bound matches the state-of-the-art for finding a FOSP, suggesting that privately finding an SOSP may be achievable at no additional cost.
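Since $\frac{\sqrt{d}}{n\epsilon}$ is typically well below 1, raising it to the larger exponent $1/2$ (versus $3/7$) makes the second term strictly smaller, i.e. the new bound is tighter. A quick sanity check with assumed toy values (not from the paper):

```python
# Illustrative values: n = 1e6 samples, d = 100 dimensions, epsilon = 1.
n, d, eps = 10**6, 100, 1.0
ratio = d**0.5 / (n * eps)            # sqrt(d)/(n*eps) = 1e-5 here
alpha_old = n**(-1/3) + ratio**(3/7)  # rate claimed by Ganesh et al. (2023)
alpha_new = n**(-1/3) + ratio**(1/2)  # rate achieved by this paper
# alpha_new < alpha_old whenever ratio < 1
```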
Problem

Research questions and friction points this paper is trying to address.

Addresses gap between FOSP and SOSP
Corrects issues in saddle point escape
Improves bounds for private SOSP
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive batch size
Binary tree mechanism
SpiderBoost algorithm framework
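The SpiderBoost recursion underlying these bullets alternates large-batch gradient refreshes with cheap small-batch updates that track only the *difference* of consecutive gradients, which has low variance when iterates move slowly. A non-private skeleton is sketched below; the paper additionally injects noise (via the binary-tree mechanism) and adapts batch sizes, and all names and parameters here are illustrative:

```python
import numpy as np

def spider_sgd(grad_i, n, x0, step=0.1, q=10, big_batch=64, small_batch=8,
               iters=50, rng=None):
    """Non-private SpiderBoost-style skeleton (a sketch, not the paper's
    algorithm). grad_i(i, x) returns the gradient of the i-th sample's
    loss at x. Every q steps the estimate v is refreshed on a large
    batch; in between, a small batch updates v with gradient
    *differences*, whose variance shrinks as consecutive iterates
    x and x_prev get close."""
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    x_prev, v = None, None
    for t in range(iters):
        if t % q == 0:
            # refresh: large-batch gradient estimate
            idx = rng.choice(n, size=min(big_batch, n), replace=False)
            v = np.mean([grad_i(i, x) for i in idx], axis=0)
        else:
            # variance-reduced correction from a small batch
            idx = rng.choice(n, size=small_batch, replace=False)
            v = v + np.mean([grad_i(i, x) - grad_i(i, x_prev) for i in idx],
                            axis=0)
        x_prev, x = x, x - step * v
    return x
```

For example, minimizing the mean of $(x - a_i)^2$ over a fixed dataset drives the iterate toward the dataset mean; the refresh/correction split is what lets the total number of per-sample gradient evaluations stay sublinear per iteration.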