Locally Private Parametric Methods for Change-Point Detection

📅 2026-02-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates the statistical and algorithmic foundations of change-point detection in time series under local differential privacy (LDP) constraints. Focusing on parametric distributional shifts, it derives improved finite-sample accuracy guarantees, via martingale methods, for a non-private change-point detection algorithm based on the generalized log-likelihood ratio test, and designs two LDP-compliant algorithms based on randomized response and binary mechanisms. The work characterizes the statistical cost of LDP in change-point detection, deriving detection error bounds in both the private and non-private settings and empirically quantifying the performance loss due to privacy. A key theoretical contribution is the proof that the strong data processing inequality (SDPI) coefficients for Rényi divergences and their symmetrized variants are achieved by binary input distributions, a structural result of independent interest in information theory and statistical estimation.
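As a rough illustration of the randomized response primitive named in the summary, here is a minimal ε-LDP sketch. The threshold-based binarization and the value of `epsilon` are illustrative assumptions, not the paper's actual mechanism.

```python
import numpy as np

def randomized_response(bit: int, epsilon: float, rng=None) -> int:
    """Classical epsilon-LDP randomized response for a single bit.

    Reports the true bit with probability e^eps / (e^eps + 1) and the
    flipped bit otherwise. How the paper encodes time-series samples
    into bits is not reproduced here.
    """
    rng = rng or np.random.default_rng()
    p_truth = np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    return bit if rng.random() < p_truth else 1 - bit

# Example: privatize a stream of (hypothetically) threshold-encoded samples.
rng = np.random.default_rng(0)
raw = (rng.normal(size=10) > 0).astype(int)
private = [randomized_response(b, epsilon=1.0, rng=rng) for b in raw]
```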

📝 Abstract
We study parametric change-point detection, where the goal is to identify distributional changes in time series, under local differential privacy. In the non-private setting, we derive improved finite-sample accuracy guarantees for a change-point detection algorithm based on the generalized log-likelihood ratio test, via martingale methods. In the private setting, we propose two locally differentially private algorithms based on randomized response and binary mechanisms, and analyze their theoretical performance. We derive bounds on detection accuracy and validate our results through empirical evaluation. Our results characterize the statistical cost of local differential privacy in change-point detection and show how privacy degrades performance relative to a non-private benchmark. As part of this analysis, we establish a structural result for strong data processing inequalities (SDPI), proving that SDPI coefficients for Rényi divergences and their symmetric variants (Jeffreys-Rényi divergences) are achieved by binary input distributions. These results on SDPI coefficients are also of independent interest, with applications to statistical estimation, data compression, and Markov chain mixing.
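To make the non-private benchmark concrete, here is a minimal sketch of a generalized log-likelihood ratio (GLR) scan for a single mean shift in unit-variance Gaussian data. The Gaussian family and the offline scan are illustrative assumptions; they are not the paper's exact algorithm or guarantees.

```python
import numpy as np

def glr_change_point(x: np.ndarray):
    """GLR scan statistic for a mean shift in i.i.d. N(mu, 1) data.

    For each candidate split k, the generalized log-likelihood ratio
    against the no-change model reduces to
        k * (n - k) / (2 * n) * (mean(x[:k]) - mean(x[k:]))**2.
    Returns the maximizing split and the maximal statistic; a change
    is declared when the maximum exceeds a chosen threshold.
    """
    n = len(x)
    best_k, best_stat = None, -np.inf
    for k in range(1, n):
        gap = x[:k].mean() - x[k:].mean()
        stat = k * (n - k) / (2.0 * n) * gap**2
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

# Example: the mean shifts from 0 to 1 at index 50.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(1, 1, 50)])
print(glr_change_point(x))  # split detected near 50
```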
Problem

Research questions and friction points this paper is trying to address.

change-point detection
local differential privacy
time series
distributional changes
parametric methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

local differential privacy
change-point detection
strong data processing inequality
Rényi divergence
martingale methods
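For readers unfamiliar with the last two tags, the standard definitions can be written as follows. These are textbook formulas, with the channel $W$ and order $\alpha$ as generic symbols, not notation taken from the paper.

```latex
% Renyi divergence of order \alpha (for \alpha \in (0,1) \cup (1,\infty)):
D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\,
  \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha} .

% SDPI (contraction) coefficient of a channel W for D_\alpha:
\eta_\alpha(W) \;=\; \sup_{P \neq Q}\;
  \frac{D_\alpha(PW \,\|\, QW)}{D_\alpha(P \,\|\, Q)} .

% The paper's structural result states that this supremum is achieved
% by binary input distributions (and likewise for the symmetrized,
% Jeffreys-Renyi, variant).
```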
Anuj Kumar Yadav
School of Computer and Communication Sciences, École Polytechnique Fédérale de Lausanne (EPFL), Vaud, Switzerland
Cemre Cadir
School of Computer and Communication Sciences, École Polytechnique Fédérale de Lausanne (EPFL), Vaud, Switzerland
Yanina Shkel
EPFL
Information Theory, Learning Theory, Coding Theory
Michael Gastpar
Professor, École Polytechnique Fédérale de Lausanne (EPFL), Switzerland
Information Theory, Signal Processing, Neuroscience