Differentially Private Manifold Denoising

📅 2026-04-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of denoising query points on a manifold while preserving the privacy of sensitive reference data to support downstream tasks. The work proposes the first differentially private manifold denoising framework, which iteratively estimates local means and tangent-space geometry from the reference data, then moves query points toward the local mean along the privately estimated subspace, with the overall release satisfying $(\varepsilon, \delta)$-differential privacy, to recover the underlying manifold structure. Under standard manifold regularity assumptions, the method provides non-asymptotic utility guarantees and incorporates a modular privacy budget scheduling mechanism. Experimental results demonstrate that the approach effectively recovers signal under moderate privacy budgets, achieving a favorable trade-off between privacy and utility, thereby offering a deployable privacy-preserving component for manifold learning in regulated settings.
📝 Abstract
We introduce a differentially private manifold denoising framework that allows users to exploit sensitive reference datasets to correct noisy, non-private query points without compromising privacy. The method follows an iterative procedure that (i) privately estimates local means and tangent geometry using the reference data under calibrated sensitivity, (ii) projects query points along the privately estimated subspace toward the local mean via corrective steps at each iteration, and (iii) performs rigorous privacy accounting across iterations and queries using $(\varepsilon,\delta)$-differential privacy (DP). Conceptually, this framework brings differential privacy to manifold methods, retaining sufficient geometric signal for downstream tasks such as embedding, clustering, and visualization, while providing formal DP guarantees for the reference data. Practically, the procedure is modular and scalable, separating DP-protected local geometry (means and tangents) from budgeted query-point updates, with a simple scheduler allocating privacy budget across iterations and queries. Under standard assumptions on manifold regularity, sampling density, and measurement noise, we establish high-probability utility guarantees showing that corrected queries converge toward the manifold at a non-asymptotic rate governed by sample size, noise level, bandwidth, and the privacy budget. Simulations and case studies demonstrate accurate signal recovery under moderate privacy budgets, illustrating clear utility-privacy trade-offs and providing a deployable DP component for manifold-based workflows in regulated environments without reengineering privacy systems.
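The three-step procedure in the abstract (private local mean, private tangent estimate, corrective projection) can be sketched as a single iteration. This is an illustrative reconstruction, not the paper's implementation: the function name, parameters (`radius`, `clip`, `step`), the even split of the per-step budget between the two Gaussian-mechanism releases, and the Analyze-Gauss-style symmetric noise on the covariance are all assumptions.

```python
import numpy as np

def dp_denoise_step(query, reference, radius, dim, eps, delta, step, clip, rng):
    """One corrective step of a DP manifold-denoising iteration (sketch only).

    Assumes reference points are pre-clipped to norm <= clip, which bounds
    the sensitivity of the local mean and covariance. The (eps, delta) budget
    for this step is split evenly between the two private releases.
    """
    # Local neighborhood of the query within bandwidth `radius`.
    dists = np.linalg.norm(reference - query, axis=1)
    nbrs = reference[dists <= radius]
    n = max(len(nbrs), 1)

    # Gaussian-mechanism noise multiplier for an (eps/2, delta/2) release.
    mult = np.sqrt(2.0 * np.log(1.25 / (delta / 2.0))) / (eps / 2.0)

    # (i) Private local mean: swapping one neighbor moves it by <= 2*clip/n.
    mean = nbrs.mean(axis=0) if len(nbrs) else query
    priv_mean = mean + rng.normal(0.0, (2.0 * clip / n) * mult, size=query.shape)

    # (ii) Private local covariance via symmetrized Gaussian noise
    # (Analyze-Gauss style); per-entry sensitivity is O(clip^2 / n).
    centered = nbrs - priv_mean
    cov = centered.T @ centered / n
    noise = rng.normal(0.0, (2.0 * clip**2 / n) * mult, size=cov.shape)
    priv_cov = cov + (noise + noise.T) / 2.0

    # Tangent space = span of the top `dim` eigenvectors of the private covariance.
    _, eigvecs = np.linalg.eigh(priv_cov)
    tangent = eigvecs[:, -dim:]

    # (iii) Corrective move: step toward the private mean along the normal
    # directions, i.e. the displacement component orthogonal to the tangent.
    disp = priv_mean - query
    normal_part = disp - tangent @ (tangent.T @ disp)
    return query + step * normal_part
```

Only the private mean and covariance touch the sensitive reference set, so per-step privacy follows from composing the two Gaussian releases; iterating the step then composes across iterations, which is where the scheduler mentioned in the abstract would come in.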
Problem

Research questions and friction points this paper addresses.

Keywords: differential privacy, manifold denoising, privacy-preserving, sensitive data, geometric signal
Innovation

Methods, ideas, or system contributions that make the work stand out.

Keywords: differentially private, manifold denoising, local tangent estimation, privacy budget scheduling, non-asymptotic utility guarantee
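The abstract describes "a simple scheduler allocating privacy budget across iterations and queries." A minimal sketch of such a scheduler, assuming basic composition with an even split (the paper's scheduler may instead use advanced composition or weight later iterations differently; the class and method names here are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class BudgetScheduler:
    """Splits a total (eps, delta) budget across T iterations by basic
    composition: running T mechanisms, each (eps/T, delta/T)-DP, yields
    an (eps, delta)-DP procedure overall."""
    eps_total: float
    delta_total: float
    T: int

    def per_iteration(self):
        # Even split; a weighted or geometric schedule is a drop-in change.
        return self.eps_total / self.T, self.delta_total / self.T
```

For example, a total budget of $(\varepsilon, \delta) = (1.0, 10^{-5})$ over 10 iterations allocates $(0.1, 10^{-6})$ per iteration.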
Authors:
Jiaqi Wu (Department of Statistics and Data Science, National University of Singapore)
Yiqing Sun (Department of Statistics and Data Science, National University of Singapore)
Zhigang Yao (National University of Singapore)
Topics: Interface of Statistics and Geometry, Statistics, Machine Learning