🤖 AI Summary
This study investigates the statistical and algorithmic foundations of change-point detection in time series under local differential privacy (LDP) constraints. Focusing on parametric distributional shifts, it derives improved finite-sample accuracy guarantees for a non-private change-point detection algorithm based on the generalized log-likelihood ratio test, and designs LDP-compliant algorithms built on randomized response and binary mechanisms. The work characterizes the statistical cost of LDP in change-point detection, deriving detection error bounds for both the private and non-private settings and empirically quantifying the performance loss due to privacy. A key theoretical contribution is the proof that the strong data processing inequality (SDPI) coefficients for Rényi divergences and their symmetrized variants are attained by binary input distributions, a result of independent interest in information theory and statistical estimation.
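To make the privacy primitive concrete, below is a minimal sketch of classic ε-LDP randomized response on binary data, together with the standard debiasing step for estimating the true mean from noisy reports. This illustrates the general mechanism only; the paper's actual algorithms, parameters, and the binary mechanism it combines with randomized response are not reproduced here.

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """epsilon-LDP randomized response: report the true bit with
    probability e^eps / (1 + e^eps), and the flipped bit otherwise."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if random.random() < p_truth else 1 - bit

def debias_mean(reports, epsilon: float) -> float:
    """Unbiased estimate of the true bit frequency from noisy reports.

    Since E[report] = p*mu + (1-p)*(1-mu) with p = e^eps/(1+e^eps),
    we invert that affine map to recover mu."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    noisy_mean = sum(reports) / len(reports)
    return (noisy_mean - (1.0 - p)) / (2.0 * p - 1.0)
```

Smaller ε means the report distribution depends less on the true bit (stronger privacy), which inflates the variance of the debiased estimate; this variance inflation is the kind of statistical cost of LDP the paper quantifies.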
📝 Abstract
We study parametric change-point detection under local differential privacy, where the goal is to identify distributional changes in time series. In the non-private setting, we derive improved finite-sample accuracy guarantees for a change-point detection algorithm based on the generalized log-likelihood ratio test, via martingale methods. In the private setting, we propose two locally differentially private algorithms based on randomized response and binary mechanisms, and analyze their theoretical performance. We derive bounds on detection accuracy and validate our results through empirical evaluation. Our results characterize the statistical cost of local differential privacy in change-point detection and show how privacy degrades performance relative to a non-private benchmark. As part of this analysis, we establish a structural result for strong data processing inequalities (SDPI), proving that SDPI coefficients for Rényi divergences and their symmetric variants (Jeffreys-Rényi divergences) are achieved by binary input distributions. These results on SDPI coefficients are also of independent interest, with applications to statistical estimation, data compression, and Markov chain mixing.
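As a concrete illustration of the non-private building block, here is a minimal generalized log-likelihood ratio scan for a single mean shift in a Gaussian sequence with known variance. This is a textbook offline GLR sketch, not the paper's algorithm: the paper's sequential setting, martingale analysis, and thresholds are not reproduced.

```python
def glr_change_point(x, sigma: float = 1.0):
    """GLR scan for one mean shift in a Gaussian sequence, known variance.

    For each candidate change point tau, the GLR statistic reduces to
        GLR(tau) = tau*(n-tau) / (2*n*sigma^2) * (mean_pre - mean_post)^2,
    i.e. a variance-scaled squared gap between the pre- and post-change
    sample means. Returns (best_tau, best_stat); a change is declared when
    best_stat exceeds a chosen threshold.
    """
    n = len(x)
    total = sum(x)
    prefix = 0.0
    best_tau, best_stat = None, -1.0
    for tau in range(1, n):  # split into x[:tau] and x[tau:]
        prefix += x[tau - 1]
        mean_pre = prefix / tau
        mean_post = (total - prefix) / (n - tau)
        stat = tau * (n - tau) / (2.0 * n * sigma**2) * (mean_pre - mean_post) ** 2
        if stat > best_stat:
            best_tau, best_stat = tau, stat
    return best_tau, best_stat
```

On a sequence whose mean jumps from 0 to 3 at index 50, the scan's maximizing split falls exactly at the jump; choosing the detection threshold is where finite-sample accuracy guarantees, like those derived in the paper, come in.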