Causal Discovery-Driven Change Point Detection in Time Series

📅 2024-07-10
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
This paper addresses the problem of accurately detecting distributional change points in target variables of multivariate time series that arise from shifts in underlying causal mechanisms. We propose a two-stage nonparametric framework: first, inferring local causal structure—specifically, the direct causes of the target variable—via constraint-based causal discovery (e.g., the PC algorithm); second, detecting mechanism-level distributional changes in the target variable conditioned on its causal parents using the conditional relative Pearson divergence. To our knowledge, this is the first method to jointly integrate causal discovery with change-point detection, relaxing the conventional i.i.d. assumption via the causal Markov condition to enable precise, mechanism-specific identification of generative process shifts. Experiments on synthetic and real-world datasets demonstrate that our approach significantly improves detection accuracy for target-variable change points while substantially reducing false positives induced by irrelevant variables.

📝 Abstract
Change point detection in time series seeks to identify times when the probability distribution of time series changes. It is widely applied in many areas, such as human-activity sensing and medical science. In the context of multivariate time series, this typically involves examining the joint distribution of high-dimensional data: If any one variable changes, the whole time series is assumed to have changed. However, in practical applications, we may be interested only in certain components of the time series, exploring abrupt changes in their distributions in the presence of other time series. Here, assuming an underlying structural causal model that governs the time-series data generation, we address this problem by proposing a two-stage non-parametric algorithm that first learns parts of the causal structure through constraint-based discovery methods. The algorithm then uses conditional relative Pearson divergence estimation to identify the change points. The conditional relative Pearson divergence quantifies the distribution disparity between consecutive segments in the time series, while the causal discovery method enables a focus on the causal mechanism, facilitating access to independent and identically distributed (IID) samples. Theoretically, the typical assumption of samples being IID in conventional change point detection methods can be relaxed based on the Causal Markov Condition. Through experiments on both synthetic and real-world datasets, we validate the correctness and utility of our approach.
Problem

Research questions and friction points this paper is trying to address.

Detect distribution shifts in multivariate time series components
Focus on causal mechanisms using constraint-based discovery methods
Identify change points via conditional relative Pearson divergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Two-stage non-parametric algorithm for detection
Constraint-based causal structure learning
Conditional relative Pearson divergence estimation
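The divergence in the second stage can be sketched with a minimal RuLSIF-style estimator of the alpha-relative Pearson divergence between two samples (e.g. consecutive segments of the target conditioned on its parents). This is a bare-bones illustration, not the paper's conditional estimator: the kernel width and regularizer are fixed constants here, whereas practical use would select them by cross-validation.

```python
import numpy as np

def rulsif_divergence(X_ref, X_test, alpha=0.5, sigma=1.0, lam=1e-3):
    """Estimate PE_alpha(P_ref || P_test), the alpha-relative Pearson
    divergence, by fitting the relative density ratio
    r(x) = p_ref(x) / (alpha * p_ref(x) + (1 - alpha) * p_test(x))
    with a Gaussian-kernel linear model (RuLSIF)."""
    centers = X_ref  # kernel centers taken from the reference sample

    def K(X):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    K_ref, K_test = K(X_ref), K(X_test)
    n_ref, n_test = len(X_ref), len(X_test)
    # Regularized least squares for the ratio-model weights theta
    H = (alpha * K_ref.T @ K_ref / n_ref
         + (1 - alpha) * K_test.T @ K_test / n_test)
    h = K_ref.mean(axis=0)
    theta = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    r_ref, r_test = K_ref @ theta, K_test @ theta  # fitted ratio values
    # Plug-in estimate of the alpha-relative Pearson divergence
    return (-alpha * (r_ref ** 2).mean() / 2
            - (1 - alpha) * (r_test ** 2).mean() / 2
            + r_ref.mean() - 0.5)
```

Scanning this score over consecutive windows of the target (conditioned on the parents found in stage one) and flagging peaks gives the change-point detector in spirit: two segments drawn from the same mechanism score near zero, while a mechanism shift yields a clearly larger divergence.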