Information Contraction under $(\varepsilon,\delta)$-Differentially Private Mechanisms

๐Ÿ“… 2026-01-23
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work addresses a limitation of existing information contraction bounds, which are largely confined to $(\varepsilon,0)$-local differential privacy (LDP) and thus fail to characterize the information loss of $(\varepsilon,\delta)$-LDP mechanisms with $\delta > 0$. By combining the mathematical structure of differential privacy with strong data-processing inequalities from information theory, the paper establishes the first linear and nonlinear information contraction bounds for $(\varepsilon,\delta)$-LDP that hold for any $\delta \geq 0$. The resulting general strong data-processing inequality applies to both the hockey-stick divergence and general $f$-divergences, removing the prior restriction to $\delta = 0$, and it yields tighter guarantees under various information measures, including total variation distance, thereby extending and improving upon existing results.
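
For context, a minimal sketch of the standard definitions behind these bounds (not quoted from the paper): the hockey-stick divergence of order $\gamma \ge 1$ is

$$
E_\gamma(P\|Q) \;=\; \sup_{A}\bigl[P(A) - \gamma\,Q(A)\bigr] \;=\; \int \bigl(\mathrm{d}P - \gamma\,\mathrm{d}Q\bigr)_+ ,
$$

and a mechanism $K$ is $(\varepsilon,\delta)$-LDP if and only if $E_{e^\varepsilon}\!\bigl(K(\cdot\mid x)\,\big\|\,K(\cdot\mid x')\bigr) \le \delta$ for all input pairs $x, x'$. Setting $\gamma = 1$ recovers total variation, $E_1(P\|Q) = \mathrm{TV}(P,Q)$, which is why contraction bounds for $E_\gamma$ subsume TV bounds.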

๐Ÿ“ Abstract
The distinguishability, as quantified by information measures after processing by a private mechanism, has been a useful tool for studying various statistical and operational tasks under privacy constraints. To this end, standard data-processing inequalities and strong data-processing inequalities (SDPIs) are employed. Most previously known, and even tight, characterizations of the contraction of information measures, including total variation distance, hockey-stick divergences, and $f$-divergences, apply only to $(\varepsilon,0)$-locally differentially private (LDP) mechanisms. In this work, we derive both linear and non-linear strong data-processing inequalities for the hockey-stick divergence and $f$-divergences that are valid for all $(\varepsilon,\delta)$-LDP mechanisms, even when $\delta \neq 0$. Our results either generalize or improve the previously known bounds on the contraction of these distinguishability measures.
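
As a quick numeric illustration (a minimal sketch, not from the paper; the helper `hockey_stick` is my own), the snippet below checks two standard facts for binary randomized response: it is $(\varepsilon,0)$-LDP in the hockey-stick sense, and it contracts total variation by exactly the classical factor $(e^\varepsilon-1)/(e^\varepsilon+1)$, the tight $\delta = 0$ coefficient that the paper's bounds generalize to $\delta > 0$.

```python
import numpy as np

def hockey_stick(p, q, gamma):
    """Hockey-stick divergence E_gamma(P || Q) = sum_y max(P(y) - gamma*Q(y), 0)."""
    return np.maximum(p - gamma * q, 0.0).sum()

eps = 1.0
# Binary randomized response: report the true bit w.p. e^eps / (1 + e^eps).
keep = np.exp(eps) / (1.0 + np.exp(eps))
K = np.array([[keep, 1.0 - keep],
              [1.0 - keep, keep]])  # K[x, y] = Pr[output = y | input = x]

# (eps, 0)-LDP check: E_{e^eps} between the two conditional output rows is 0.
print(hockey_stick(K[0], K[1], np.exp(eps)))      # ~0.0, i.e. delta = 0

# Contraction of total variation (TV = E_1): push two priors through K.
P = np.array([0.9, 0.1])
Q = np.array([0.2, 0.8])
tv_in = 0.5 * np.abs(P - Q).sum()
tv_out = 0.5 * np.abs(P @ K - Q @ K).sum()
print(tv_out / tv_in)                             # contraction ratio
print((np.exp(eps) - 1.0) / (np.exp(eps) + 1.0))  # matches (e^eps-1)/(e^eps+1)
```

Randomized response is known to attain the $\delta = 0$ TV coefficient with equality, which is why the printed ratio matches the bound exactly rather than merely staying below it.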
Problem

Research questions and friction points this paper is trying to address.

differential privacy
information contraction
strong data-processing inequality
hockey-stick divergence
f-divergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

strong data-processing inequalities
hockey-stick divergence
f-divergences
local differential privacy
information contraction
๐Ÿ”Ž Similar Papers
No similar papers found.