🤖 AI Summary
Addressing the inherent tension between data visibility and privacy protection in mobile device privacy settings, this paper tackles involuntary privacy leakage arising from statistical dependencies among users. We propose the first data-correlation-aware local redaction mechanism. Specifically, we introduce a novel metric—“data-correlation privacy leakage”—and develop a Markov correlation model to characterize dependency structures in user data. Our approach integrates information-theoretic leakage analysis with adaptive local perturbation design. Theoretically, we prove that our mechanism strictly surpasses the utility upper bound of data-agnostic redaction under identical privacy budgets. Empirically, it significantly improves data utility while preserving privacy. Our core contribution is the first incorporation of data dependency modeling into the local redaction framework, enabling simultaneous optimization of both informativeness and protection. This advances the state of the art in privacy-preserving data publishing for resource-constrained edge environments.
📝 Abstract
When users make personal privacy choices, correlations among their data can cause inadvertent leakage: users who share their data may reveal information about users who choose not to. As a remedy, we consider local redaction mechanisms. Since prior works proposed data-independent privatization mechanisms, we first study the family of data-independent local redaction mechanisms and upper-bound their utility when data correlation is modeled by a stationary Markov process. In contrast, we derive a novel data-dependent mechanism that improves utility by leveraging a data-dependent leakage measure.
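To make the correlation-leakage idea concrete, here is a minimal numerical sketch, not the paper's actual mechanism: neighboring users' data is modeled as a stationary two-state Markov chain, and a simple data-dependent rule redacts each symbol with just enough probability to cap the worst-case posterior-to-prior log-ratio it induces about a non-sharing neighbor. The transition matrix, the leakage measure, and the redaction heuristic are all illustrative assumptions.

```python
import numpy as np

# Hypothetical transition matrix: P[a, b] = Pr(X_{i+1} = b | X_i = a),
# modeling correlation between neighboring users' data.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Stationary distribution pi solves pi = pi @ P (left eigenvector for eigenvalue 1).
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.isclose(eigvals, 1)].flatten())
pi = pi / pi.sum()


def posterior(b):
    """Adversary's posterior on the non-sharing user X_i after
    observing a neighbor's shared value X_{i+1} = b (Bayes' rule)."""
    joint = pi * P[:, b]  # Pr(X_i = a, X_{i+1} = b)
    return joint / joint.sum()


def leakage(b):
    """Correlation-induced leakage of symbol b: worst-case absolute
    log posterior-to-prior ratio about the neighbor (illustrative measure)."""
    return np.max(np.abs(np.log(posterior(b) / pi)))


# A data-dependent redaction rule (hypothetical heuristic): redact symbol b
# with probability q(b) so the expected leakage (1 - q(b)) * leakage(b)
# stays under a budget eps; more revealing symbols are redacted more often.
eps = 0.3
redact_prob = [max(0.0, 1.0 - eps / leakage(b)) for b in range(2)]
```

A data-independent mechanism would have to use a single redaction probability for every symbol, tuned to the worst case; tailoring `redact_prob` per symbol is what lets a data-dependent design retain more utility under the same budget.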