🤖 AI Summary
Online time-series forecasting faces the challenge of simultaneously preserving long-term dependencies and adapting to short-term dynamics—particularly under non-stationary, abrupt-change streaming conditions. To address this, we propose the first intervention-aware, identifiable disentanglement framework for long- and short-horizon latent states. Our method introduces a dual-encoder architecture—one capturing persistent trends and the other modeling transient intervention responses—augmented by joint smoothness and interruption-dependency constraints, integrated with causal representation learning and online recursive state updating. Crucially, it explicitly decouples long-term trends from short-term intervention effects. Evaluated across multiple benchmark datasets, our approach achieves significant improvements over state-of-the-art methods, demonstrating superior robustness to distributional shifts and enhanced prediction accuracy under non-stationarity and abrupt changes.
📝 Abstract
Current methods for time series forecasting struggle in the online scenario, since it is difficult to preserve long-term dependencies while adapting to short-term changes when data arrive sequentially. Although some recent methods address this problem by controlling the updates of latent states, they cannot disentangle the long/short-term states, leading to an inability to adapt effectively to non-stationarity. To tackle this challenge, we propose a general framework to disentangle long/short-term states for online time series forecasting. Our idea is inspired by the observation that short-term changes can be caused by unknown interventions, such as abrupt policy shifts in the stock market. Based on this insight, we formalize a data generation process with unknown interventions on short-term states. Under mild assumptions, we further leverage the independence of short-term states induced by unknown interventions to establish an identification theory that achieves the disentanglement of long/short-term states. Built on this theory, we develop a long short-term disentanglement model (LSTD) to extract the long/short-term states with long- and short-term encoders, respectively. Furthermore, the LSTD model incorporates a smooth constraint to preserve long-term dependencies and an interrupted dependency constraint to enforce the forgetting of short-term dependencies, together boosting the disentanglement of long/short-term states. Experimental results on several benchmark datasets show that our LSTD model outperforms existing methods for online time series forecasting, validating its efficacy in real-world applications.
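To make the two constraints concrete, here is a minimal NumPy sketch of how a smooth constraint on long-term states and an interrupted dependency constraint on short-term states might be computed. The linear encoders, the squared-difference smoothness penalty, and the lag-1 decorrelation penalty are all illustrative assumptions, not the paper's actual losses or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy observation stream: T steps of d-dimensional data
T, d, k = 200, 8, 4
X = rng.normal(size=(T, d))

# Hypothetical stand-ins for the long/short-term encoders
# (the paper's encoders are learned networks; linear maps used here
# only to illustrate the two regularizers)
W_long = rng.normal(scale=0.1, size=(d, k))
W_short = rng.normal(scale=0.1, size=(d, k))

z_long = X @ W_long    # persistent-trend latent states
z_short = X @ W_short  # transient intervention latent states

# Smooth constraint: long-term states should evolve slowly,
# so penalize differences between consecutive steps.
smooth_loss = np.mean((z_long[1:] - z_long[:-1]) ** 2)

# Interrupted dependency constraint: short-term states should not
# carry over across steps; one plausible instantiation penalizes
# the lag-1 inner product of centered short-term states.
zc = z_short - z_short.mean(axis=0)
interrupt_loss = np.mean((zc[1:] * zc[:-1]).sum(axis=1) ** 2)

# Both penalties would be added to the forecasting loss during training.
total_aux_loss = smooth_loss + interrupt_loss
```

In a full training loop these terms would be minimized jointly with the prediction objective, pushing the long-term encoder toward slowly varying states and the short-term encoder toward states whose temporal dependencies are forgotten.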