Differentially Private Quasi-Concave Optimization: Bypassing the Lower Bound and Application to Geometric Problems

📅 2025-04-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the sample complexity of differentially private quasi-concave optimization, aiming to bypass the exponential lower bound Ω(2^{log^*|X|}) established by Cohen et al. for generic private optimizers, with applications to natural geometric problems including center-point selection and PAC learning of d-dimensional halfspaces. To this end, the paper introduces the notion of *approximate quasi-concavity* and constructs a generic differentially private optimization framework for this function class. The analysis shows that the dependence of the sample complexity on the input domain size |X| improves dramatically: from exponential in log^*|X|, i.e., 2^{log^*|X|}, down to log^*|X| itself. Specifically, the paper achieves an upper bound of Õ(d^{5.5}·log^*|X|), improving the |X|-dependence over the prior best Õ(d^{2.5}·2^{log^*|X|}). This matches the known Ω(log^*|X|) lower bound in its dependence on |X| (up to dimension factors), making it the first work to reduce this dependence for these problems from 2^{log^*|X|} to log^*|X|.
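For intuition, quasi-concavity of a function over a one-dimensional ordered domain amounts to unimodality: every superlevel set {x : f(x) ≥ t} is an interval, so f never rises again after it has strictly fallen. A minimal illustrative check of this property (not from the paper; the function name is ours):

```python
def is_quasi_concave_1d(values):
    """Check quasi-concavity of f over a finite, totally ordered 1-D domain,
    given as the list of values f(x_1), ..., f(x_n).

    Quasi-concavity here is equivalent to unimodality: once the sequence has
    strictly decreased, it may never strictly increase again (equivalently,
    every superlevel set is a contiguous interval)."""
    decreased = False
    for prev, cur in zip(values, values[1:]):
        if cur < prev:
            decreased = True
        elif cur > prev and decreased:
            # Rose again after falling: some superlevel set is disconnected.
            return False
    return True
```

For example, `[1, 3, 5, 4, 2]` passes (single peak), while `[1, 3, 2, 4]` fails (two peaks). Monotone and constant sequences pass, as they should.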

📝 Abstract
We study the sample complexity of differentially private optimization of quasi-concave functions. For a fixed input domain $\mathcal{X}$, Cohen et al. (STOC 2023) proved that any generic private optimizer for low-sensitivity quasi-concave functions must have sample complexity $\Omega(2^{\log^*|\mathcal{X}|})$. We show that the lower bound can be bypassed for a series of ``natural'' problems. We define a new class of \emph{approximated} quasi-concave functions, and present a generic differentially private optimizer for approximated quasi-concave functions with sample complexity $\tilde{O}(\log^*|\mathcal{X}|)$. As applications, we use our optimizer to privately select a center point of points in $d$ dimensions and \emph{probably approximately correct} (PAC) learn $d$-dimensional halfspaces. In previous works, Bun et al. (FOCS 2015) proved a lower bound of $\Omega(\log^*|\mathcal{X}|)$ for both problems. Beimel et al. (COLT 2019) and Kaplan et al. (NeurIPS 2020) gave an upper bound of $\tilde{O}(d^{2.5}\cdot 2^{\log^*|\mathcal{X}|})$ for the two problems, respectively. We improve the dependency of the upper bounds on the cardinality of the domain by presenting a new upper bound of $\tilde{O}(d^{5.5}\cdot\log^*|\mathcal{X}|)$ for both problems. To the best of our knowledge, this is the first work to reduce the sample complexity dependency on $|\mathcal{X}|$ for these two problems from exponential in $\log^* |\mathcal{X}|$ to $\log^* |\mathcal{X}|$.
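The bounds above are stated in terms of the iterated logarithm $\log^*|\mathcal{X}|$: the number of times the logarithm must be applied to $|\mathcal{X}|$ before the result drops to at most 1. A minimal sketch of computing it (not from the paper; base 2 assumed), which illustrates how slowly it grows and hence how large the gap is between $2^{\log^*|\mathcal{X}|}$ and $\log^*|\mathcal{X}|$:

```python
import math

def log_star(n):
    """Iterated logarithm (base 2): the number of times log2 must be
    applied to n before the result is <= 1."""
    count = 0
    x = float(n)
    while x > 1.0:
        x = math.log2(x)
        count += 1
    return count
```

For instance, `log_star(65536)` is 4 (65536 → 16 → 4 → 2 → 1), so even for a domain of size $2^{16}$ the new bound pays a factor of 4 where the old one paid $2^4 = 16$, and the gap widens rapidly as $|\mathcal{X}|$ grows.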
Problem

Research questions and friction points this paper is trying to address.

Bypass the exponential lower bound for generic differentially private quasi-concave optimization
Optimize approximated quasi-concave functions with reduced sample complexity
Improve sample complexity for private center-point selection and PAC learning of halfspaces
Innovation

Methods, ideas, or system contributions that make the work stand out.

Defined the class of approximated quasi-concave functions
Developed a generic private optimizer with Õ(log^*|X|) sample complexity
Improved the dependence of the upper bounds on the domain cardinality from 2^{log^*|X|} to log^*|X|