On Purely Private Covariance Estimation

📅 2025-10-30
🤖 AI Summary
This paper investigates optimal estimation of high-dimensional covariance matrices under pure differential privacy. Addressing the two distinct sample-size regimes $n \gtrless d^2/\varepsilon$, the authors propose a simple matrix perturbation mechanism based on projection onto a nuclear-norm ball. The method achieves state-of-the-art error bounds under all $p$-Schatten norms ($p \geq 1$). Notably, it attains the information-theoretically optimal rate $O(d/\sqrt{n\varepsilon})$ in spectral norm, the first such result. In Frobenius norm, it achieves the provably optimal $O(\sqrt{d}/\sqrt{n\varepsilon})$ for large $n$, and for small $n$ it improves the error from $O(\sqrt{d/n})$ to $O(\sqrt{d\,\mathrm{Tr}(\Sigma)/(n\varepsilon)})$, explicitly leveraging the trace of the true covariance $\Sigma$. The approach unifies tools from matrix analysis and pure differential privacy theory, balancing algorithmic simplicity with theoretical tightness.
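The perturbation step of such a mechanism can be illustrated with a short sketch: add symmetrized heavy-tailed (Laplace-style) noise to the empirical covariance. This is a hedged illustration only, not the paper's exact mechanism; the noise scale `d / (n * eps)` and the function name are assumptions chosen for demonstration, and the paper's actual calibration depends on the sensitivity analysis it develops.

```python
import numpy as np

def perturbed_covariance(X, eps, rng=None):
    """Schematic pure-DP covariance release: empirical covariance plus
    symmetrized Laplace noise. The scale below is illustrative only; the
    paper calibrates noise to the data's sensitivity, which depends on a
    norm bound on the samples."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    sigma_hat = X.T @ X / n            # empirical covariance
    scale = d / (n * eps)              # assumed calibration (illustrative)
    noise = rng.laplace(scale=scale, size=(d, d))
    noise = (noise + noise.T) / 2      # symmetrize so the release is symmetric
    return sigma_hat + noise
```

The symmetrization keeps the released matrix in the space of symmetric matrices, so downstream spectral-norm or Schatten-norm error analysis applies directly.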

Technology Category

Application Category

📝 Abstract
We present a simple perturbation mechanism for the release of $d$-dimensional covariance matrices $\Sigma$ under pure differential privacy. For large datasets with at least $n \geq d^2/\varepsilon$ elements, our mechanism recovers the provably optimal Frobenius norm error guarantees of \cite{nikolov2023private}, while simultaneously achieving the best known error for all other $p$-Schatten norms, with $p \in [1,\infty]$. Our error is information-theoretically optimal for all $p \ge 2$; in particular, our mechanism is the first purely private covariance estimator that achieves optimal error in spectral norm. For small datasets $n < d^2/\varepsilon$, we further show that by projecting the output onto the nuclear norm ball of appropriate radius, our algorithm achieves the optimal Frobenius norm error $O\big(\sqrt{d\,\mathrm{Tr}(\Sigma)/n}\big)$, improving over the known bounds of $O(\sqrt{d/n})$ of \cite{nikolov2023private} and $O\big(d^{3/4}\sqrt{\mathrm{Tr}(\Sigma)/n}\big)$ of \cite{dong2022differentially}.
Problem

Research questions and friction points this paper is trying to address.

Estimating covariance matrices under pure differential privacy constraints
Achieving optimal error bounds in Frobenius and spectral norms
Improving accuracy for small datasets via nuclear norm projection
Innovation

Methods, ideas, or system contributions that make the work stand out.

Simple perturbation mechanism for covariance matrices
Achieves optimal error in spectral norm
Projects output onto nuclear norm ball
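The nuclear-norm ball projection in the last bullet can be sketched as follows. For a symmetric matrix, Frobenius-norm projection onto $\{M : \|M\|_* \le r\}$ reduces to an eigendecomposition followed by Euclidean projection of the eigenvalues onto the $\ell_1$ ball; this reduction is standard, while the helper names below are illustrative and the paper's choice of radius is not reproduced here.

```python
import numpy as np

def project_l1_ball(v, r):
    """Euclidean projection of vector v onto the l1 ball of radius r
    (sort-based algorithm)."""
    if np.abs(v).sum() <= r:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]                 # magnitudes, descending
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, len(v) + 1) > css - r)[0][-1]
    theta = (css[k] - r) / (k + 1)               # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def project_nuclear_ball(A, r):
    """Project symmetric matrix A onto {M : ||M||_* <= r} in Frobenius
    norm: eigendecompose, then project the eigenvalues onto the l1 ball."""
    w, V = np.linalg.eigh(A)
    w_proj = project_l1_ball(w, r)
    return (V * w_proj) @ V.T
```

Because the nuclear norm of a symmetric matrix is the $\ell_1$ norm of its eigenvalues, the projection only shrinks the spectrum and never rotates the eigenbasis.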
Tommaso d'Orsi
Bocconi
Algorithms · Computational Complexity · Machine Learning · Privacy
Gleb Novikov
Lucerne School of Computer Science and Information Technology, Switzerland.