Leveraging Multivariate Long-Term History Representation for Time Series Forecasting

📅 2025-05-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing multivariate time series (MTS) forecasting methods, including most spatial-temporal graph neural networks (STGNNs), predominantly capture short-term, local spatial-temporal dependencies while neglecting long-term cross-variable similarities and correlations, limiting prediction accuracy. To address this, the paper proposes the Long-term Multivariate History Representation (LMHR) framework, which enhances STGNNs with three components: a Long-term History Encoder (LHEncoder) that compresses long-term history into noise-robust segment-level representations, a non-parametric Hierarchical Representation Retriever (HRetriever) that selects the most valuable representations with spatial awareness and no additional training, and a Transformer-based Aggregator (TAggregator) that efficiently fuses the sparsely retrieved representations using ranking positional embeddings. Experiments on multiple real-world benchmarks show that the method reduces average forecasting error by 10.72% over representative STGNNs and by 4.12% over state-of-the-art approaches, while improving accuracy on the top 10% of rapidly changing patterns by 9.8%.

📝 Abstract
Multivariate Time Series (MTS) forecasting has a wide range of applications in both industry and academia. Recent advances in Spatial-Temporal Graph Neural Network (STGNN) have achieved great progress in modelling spatial-temporal correlations. Limited by computational complexity, most STGNNs for MTS forecasting focus primarily on short-term and local spatial-temporal dependencies. Although some recent methods attempt to incorporate univariate history into modeling, they still overlook crucial long-term spatial-temporal similarities and correlations across MTS, which are essential for accurate forecasting. To fill this gap, we propose a framework called the Long-term Multivariate History Representation (LMHR) Enhanced STGNN for MTS forecasting. Specifically, a Long-term History Encoder (LHEncoder) is adopted to effectively encode the long-term history into segment-level contextual representations and reduce point-level noise. A non-parametric Hierarchical Representation Retriever (HRetriever) is designed to include the spatial information in the long-term spatial-temporal dependency modelling and pick out the most valuable representations with no additional training. A Transformer-based Aggregator (TAggregator) selectively fuses the sparsely retrieved contextual representations based on the ranking positional embedding efficiently. Experimental results demonstrate that LMHR outperforms typical STGNNs by 10.72% on the average prediction horizons and state-of-the-art methods by 4.12% on several real-world datasets. Additionally, it consistently improves prediction accuracy by 9.8% on the top 10% of rapidly changing patterns across the datasets.
Problem

Research questions and friction points this paper is trying to address.

Addresses limited long-term spatial-temporal dependencies in MTS forecasting
Enhances STGNNs with long-term multivariate history representation
Improves accuracy for rapidly changing patterns in time series
Innovation

Methods, ideas, or system contributions that make the work stand out.

Long-term History Encoder for segment-level representations
Non-parametric Hierarchical Representation Retriever for spatial info
Transformer-based Aggregator fuses sparse contextual representations
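The encode-retrieve-aggregate pipeline described above can be caricatured in a short, self-contained sketch. This is a toy illustration under invented assumptions, not the paper's implementation: mean-pooling stands in for the learned LHEncoder, cosine similarity for the HRetriever's hierarchical spatial-temporal matching, and a single attention step with additive rank embeddings for the TAggregator; all function names, shapes, and the random rank embeddings are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_segments(x, seg_len):
    """Split a (T, V) multivariate series into segments and mean-pool over
    time: a crude stand-in for the LHEncoder, which turns point-level
    history into noise-reduced segment-level representations."""
    T, V = x.shape
    n = T // seg_len
    return x[: n * seg_len].reshape(n, seg_len, V).mean(axis=1)  # (n, V)

def retrieve_top_k(query, bank, k):
    """Non-parametric retrieval by cosine similarity (stand-in for the
    HRetriever): no trainable weights, just a similarity ranking."""
    q = query / (np.linalg.norm(query) + 1e-8)
    b = bank / (np.linalg.norm(bank, axis=1, keepdims=True) + 1e-8)
    idx = np.argsort(-(b @ q))[:k]
    return idx, bank[idx]

def aggregate(query, retrieved, rank_emb):
    """Attention-style fusion where each key carries an embedding of its
    retrieval rank (stand-in for the TAggregator's ranking positional
    embedding)."""
    keys = retrieved + rank_emb[: len(retrieved)]
    scores = keys @ query / np.sqrt(query.size)
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ retrieved  # weighted fusion of retrieved representations

# Toy multivariate history: 240 time steps, 3 variables.
hist = rng.normal(size=(240, 3))
bank = encode_segments(hist, seg_len=12)       # (20, 3) segment reps
query = bank[-1]                               # latest segment as the query
idx, top = retrieve_top_k(query, bank[:-1], k=4)
rank_emb = rng.normal(scale=0.1, size=(4, 3))  # learned in the paper; random here
fused = aggregate(query, top, rank_emb)        # long-term context vector, shape (3,)
```

In the paper, the fused long-term representation then conditions the STGNN's short-term prediction; here it is simply a `(3,)` context vector.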