Disentangling Long-Short Term State Under Unknown Interventions for Online Time Series Forecasting

📅 2025-02-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
Online time-series forecasting must simultaneously preserve long-term dependencies and adapt to short-term dynamics, particularly under non-stationary streaming conditions with abrupt changes. To address this, the paper proposes an intervention-aware, identifiable framework for disentangling long- and short-term latent states. The method uses a dual-encoder architecture, one encoder capturing persistent trends and the other modeling transient intervention responses, augmented by a smoothness constraint and an interrupted-dependency constraint, and combined with causal representation learning and online recursive state updating. Crucially, it explicitly decouples long-term trends from short-term intervention effects. Evaluated on multiple benchmark datasets, the approach improves over state-of-the-art methods, showing greater robustness to distribution shifts and higher prediction accuracy under non-stationarity and abrupt changes.

📝 Abstract
Current methods for time series forecasting struggle in the online scenario, since it is difficult to preserve long-term dependency while adapting to short-term changes when data arrive sequentially. Although some recent methods address this problem by controlling the updates of latent states, they cannot disentangle the long/short-term states, leading to an inability to adapt effectively to non-stationarity. To tackle this challenge, we propose a general framework to disentangle long/short-term states for online time series forecasting. Our idea is inspired by the observation that short-term changes can be driven by unknown interventions, such as abrupt policy shifts in the stock market. Based on this insight, we formalize a data generation process with unknown interventions on short-term states. Under mild assumptions, we further leverage the independence of short-term states induced by unknown interventions to establish an identification theory that achieves the disentanglement of long/short-term states. Built on this theory, we develop a long short-term disentanglement model (LSTD) that extracts the long/short-term states with long/short-term encoders, respectively. Furthermore, the LSTD model incorporates a smooth constraint to preserve long-term dependencies and an interrupted dependency constraint to enforce the forgetting of short-term dependencies, together boosting the disentanglement of long/short-term states. Experimental results on several benchmark datasets show that our LSTD model outperforms existing methods for online time series forecasting, validating its efficacy in real-world applications.
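The abstract's two regularizers can be illustrated with a minimal NumPy sketch. This is a hypothetical reading, not the paper's exact losses: the function names, the squared-difference form of the smooth constraint, and the lagged-correlation form of the interrupted-dependency constraint are assumptions for illustration only.

```python
import numpy as np

def smoothness_loss(z_long):
    """Smooth constraint (sketch): penalize large changes between
    consecutive long-term states so they vary slowly, preserving
    long-term dependencies. z_long has shape (T, d)."""
    diffs = z_long[1:] - z_long[:-1]
    return float(np.mean(diffs ** 2))

def interrupted_dependency_loss(z_short):
    """Interrupted dependency constraint (sketch): under unknown
    interventions, consecutive short-term states should be roughly
    independent, so drive their lag-1 correlation toward zero.
    z_short has shape (T, d)."""
    a = z_short[:-1] - z_short[:-1].mean(axis=0)
    b = z_short[1:] - z_short[1:].mean(axis=0)
    lag1_cov = (a * b).mean(axis=0)  # per-dimension lag-1 covariance
    return float(np.mean(lag1_cov ** 2))
```

In training, both terms would be added to the forecasting objective: the first keeps the long-term encoder's states persistent, while the second forces the short-term encoder's states to forget their past.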
Problem

Research questions and friction points this paper is trying to address.

Disentangling long-short term states
Handling unknown interventions
Improving online time series forecasting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Disentangles long/short-term states
Leverages unknown interventions
Incorporates smooth and interrupted constraints
Ruichu Cai
Professor of Computer Science, Guangdong University of Technology
causality
Haiqin Huang
School of Computer Science, Guangdong University of Technology, China
Zhifang Jiang
School of Computer Science, Guangdong University of Technology, China
Zijian Li
Machine Learning Department, Mohamed bin Zayed University of Artificial Intelligence, United Arab Emirates
Changze Zhou
School of Computer Science, Guangdong University of Technology, China
Yuequn Liu
School of Computer Science, Guangdong University of Technology, China
Yuming Liu
University of Wisconsin at Madison
biophotonics; quantitative biomedical image analysis; Monte Carlo simulation of light transport in biological tissue
Zhifeng Hao
Shantou University