DeformTime: Capturing Variable Dependencies with Deformable Attention for Time Series Forecasting

📅 2024-06-11
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing autoregressive models for multivariate time series forecasting neglect exogenous variables and struggle to capture dynamic inter-variable and temporal dependencies. To address this, we propose DeformTime, an end-to-end neural framework. Its core innovation is a dual-path deformable attention mechanism, comprising variable-wise deformable attention blocks (DABs) and temporal DABs, that enables sparse, adaptive modelling of cross-variable and cross-time-step dependencies. A dedicated input transformation further enhances learning from the deformed series. Crucially, DeformTime explicitly incorporates exogenous variables, departing from conventional autoregressive paradigms. Evaluated on six MTS benchmarks, including challenging infectious disease modelling tasks, DeformTime achieves an average 7.2% reduction in mean absolute error over state-of-the-art methods, with consistent improvements at longer forecasting horizons.

📝 Abstract
In multivariable time series (MTS) forecasting, existing state-of-the-art deep learning approaches tend to focus on autoregressive formulations and often overlook the potential of using exogenous variables in enhancing the prediction of the target endogenous variable. To address this limitation, we present DeformTime, a neural network architecture that attempts to capture correlated temporal patterns from the input space, and hence, improve forecasting accuracy. It deploys two core operations performed by deformable attention blocks (DABs): learning dependencies across variables from different time steps (variable DAB), and preserving temporal dependencies in data from previous time steps (temporal DAB). Input data transformation is explicitly designed to enhance learning from the deformed series of information while passing through a DAB. We conduct extensive experiments on 6 MTS data sets, using previously established benchmarks as well as challenging infectious disease modelling tasks with more exogenous variables. The results demonstrate that DeformTime improves accuracy against previous competitive methods across the vast majority of MTS forecasting tasks, reducing the mean absolute error by 7.2% on average. Notably, performance gains remain consistent across longer forecasting horizons.
Problem

Research questions and friction points this paper is trying to address.

Capturing correlated temporal patterns in multivariable time series forecasting
Enhancing prediction accuracy using exogenous variables
Addressing limitations of autoregressive approaches in time series analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

Deformable attention blocks (DABs) capture dependencies across variables
Input transformation enhances learning from the deformed series
Dual DABs model both temporal and cross-variable patterns
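The core idea behind a temporal DAB, as described above, is that attention does not attend to all past time steps uniformly, but samples a small set of adaptively chosen (deformed) positions and interpolates values at those fractional locations. The sketch below is a minimal, hypothetical NumPy illustration of that sampling-and-attending pattern on a univariate series; the function names, fixed offsets, and scalar scoring are illustrative assumptions, not the paper's actual architecture (which learns offsets end-to-end and operates on multivariate embeddings).

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def linear_sample(x, pos):
    # Sample series x (shape (T,)) at fractional positions via linear interpolation.
    T = len(x)
    pos = np.clip(pos, 0, T - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, T - 1)
    frac = pos - lo
    return x[lo] * (1 - frac) + x[hi] * frac

def temporal_deformable_attention(x, offsets):
    # Toy temporal deformable attention: for each time step t, attend only to
    # a few sampled locations t + offset (here fixed; normally learned),
    # rather than to every past step.
    T = len(x)
    out = np.zeros(T)
    for t in range(T):
        pos = t + offsets[t]           # deformed sampling positions
        vals = linear_sample(x, pos)   # values at fractional positions
        scores = vals * x[t]           # dot-product score (scalar features)
        w = softmax(scores)            # attention weights over sampled points
        out[t] = (w * vals).sum()
    return out

T = 16
x = np.sin(np.linspace(0, 3, T))
rng = np.random.default_rng(0)
offsets = rng.uniform(-3.0, 0.0, size=(T, 4))  # each step looks back up to 3 steps
y = temporal_deformable_attention(x, offsets)
print(y.shape)
```

A variable-wise DAB would apply the same sparse sampling idea across the variable dimension instead of the time axis, letting the target series adaptively select which exogenous variables (and at which lags) to attend to.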