🤖 AI Summary
This work addresses causal discovery in conditionally stationary time series—processes that are non-stationary overall but stationary when conditioned on latent states. The proposed State-Dependent Causal Inference (SDCI) framework recovers state-dependent causal graphs with provable identifiability under mild, verifiable conditions. SDCI combines latent-variable modeling, nonlinear temporal dynamics, and variational inference, with guarantees from identifiability theory supporting reliable causal structure estimation. Empirically, SDCI outperforms existing causal discovery baselines on nonlinear particle-interaction data and gene regulatory network inference. Moreover, on NBA player trajectory prediction—a challenging sequential forecasting task—its causal modeling approach outperforms non-causal RNNs, demonstrating both effectiveness and generalization across domains.
📝 Abstract
Causal discovery, i.e., inferring underlying causal relationships from observational data, is highly challenging for AI systems. In a time series modeling context, traditional causal discovery methods mainly consider constrained scenarios with fully observed variables and/or data from stationary time series. We develop a causal discovery approach that handles a wide class of nonstationary time series that are conditionally stationary, where the nonstationary behaviour is modeled as stationarity conditioned on a set of latent state variables. Our approach, named State-Dependent Causal Inference (SDCI), recovers the underlying causal dependencies, with provable identifiability for the state-dependent causal structures. Empirical experiments on nonlinear particle interaction data and gene regulatory networks demonstrate SDCI's superior performance over baseline causal discovery methods. Improved results over non-causal RNNs on modeling NBA player movements demonstrate the potential of our method and motivate the use of causality-driven methods for forecasting.
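To make the core notion concrete, below is a minimal toy sketch (not the paper's actual model) of a conditionally stationary process: two variables whose causal graph switches with a latent binary state. All coefficients, noise scales, and the two-state setup are illustrative assumptions.

```python
import numpy as np

# Toy conditionally stationary time series (illustrative only).
# A latent state s_t selects which causal edge is active:
#   state 0: x_{t-1} -> y_t   (x drives y)
#   state 1: y_{t-1} -> x_t   (y drives x)
rng = np.random.default_rng(0)

T = 500
x = np.zeros(T)
y = np.zeros(T)
s = rng.integers(0, 2, size=T)  # latent state sequence (assumed i.i.d. here)

for t in range(1, T):
    if s[t] == 0:
        # Active graph: x -> y; x evolves autonomously.
        x[t] = 0.5 * x[t - 1] + rng.normal(scale=0.1)
        y[t] = 0.8 * x[t - 1] + rng.normal(scale=0.1)
    else:
        # Active graph: y -> x; y evolves autonomously.
        x[t] = 0.8 * y[t - 1] + rng.normal(scale=0.1)
        y[t] = 0.5 * y[t - 1] + rng.normal(scale=0.1)

# Conditioned on the state, each regime's dynamics are stationary;
# marginally, the series is non-stationary because the active causal
# graph keeps switching. SDCI's goal is to recover such state-dependent
# graphs from the observed (x, y) alone.
```

A state-dependent causal discovery method would infer both the latent state sequence and the per-state edge structure, whereas a single static graph cannot represent this switching dependency.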