A Comprehensive Survey of Time Series Forecasting: Architectural Diversity and Open Challenges

📅 2024-10-24
🏛️ arXiv.org
📈 Citations: 2
Influential: 0
🤖 AI Summary
Existing deep learning architectures for time series forecasting—ranging from MLPs, CNNs, RNNs, GNNs, Transformers, and Mamba to diffusion models and foundation models—face persistent challenges in modeling channel dependencies, ensuring distributional shift robustness, enabling causal inference, and achieving feature disentanglement. Method: We propose an “architectural renaissance” perspective, empirically revealing the surprising competitiveness of linear models across diverse scenarios; establish a unified evaluation framework grounded in intrinsic data characteristics, systematically mapping architecture choices to task properties for the first time; and integrate hybrid modeling, generative modeling, and mechanism-driven approaches into a cross-paradigm unification pathway. Contribution/Results: Our work delivers an open problem inventory, standardized evaluation dimensions, and low-barrier practical guidelines—shifting time series intelligence from heuristic “architecture stacking” toward principled “mechanism-aware adaptation.”

📝 Abstract
Time series forecasting is a critical task that provides key information for decision-making across various fields. Recently, fundamental deep learning architectures such as MLPs, CNNs, RNNs, and GNNs have been developed and applied to solve time series forecasting problems. However, the structural limitations caused by the inductive biases of each architecture have constrained their performance. Transformer models, which excel at handling long-term dependencies, have become significant architectural components for time series forecasting. However, recent research has shown that alternatives such as simple linear layers can outperform Transformers. These findings have opened up new possibilities for using diverse architectures. In this context of exploration into various models, the architectural modeling of time series forecasting has entered a renaissance. This survey not only provides historical context for time series forecasting but also offers a comprehensive and timely analysis of the movement toward architectural diversification. By comparing and re-examining various deep learning models, we uncover new perspectives and present the latest trends in time series forecasting, including the emergence of hybrid models, diffusion models, Mamba models, and foundation models. By focusing on the inherent characteristics of time series data, we also address open challenges that have gained attention in time series forecasting, such as channel dependency, distribution shift, causality, and feature extraction. This survey explores vital elements that can enhance forecasting performance through diverse approaches. These contributions lower the entry barrier for newcomers to the field of time series forecasting, while also offering seasoned researchers broad perspectives, new opportunities, and deep insights.
Problem

Research questions and friction points this paper is trying to address.

Exploring diverse architectures for time series forecasting performance
Addressing structural limitations in deep learning models for forecasting
Analyzing challenges like channel dependency and distribution shift
Innovation

Methods, ideas, or system contributions that make the work stand out.

Utilizes Transformer models for long-term dependencies
Explores simple linear layers as Transformer alternatives
Investigates hybrid and diffusion models for forecasting
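To make the "simple linear layers as Transformer alternatives" point concrete, here is a minimal illustrative sketch (not code from the survey; the function names are hypothetical) of a linear forecaster that maps a fixed lookback window directly to the next value via ordinary least squares:

```python
import numpy as np

def fit_linear_forecaster(series, lookback):
    """Fit a single linear layer that maps the last `lookback`
    values of a 1-D series to the next value, via least squares."""
    X = np.array([series[i:i + lookback]
                  for i in range(len(series) - lookback)])
    y = series[lookback:]
    # Append a bias column, then solve the least-squares problem.
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def predict_next(series, w, lookback):
    """One-step forecast from the last `lookback` observations."""
    window = np.append(np.asarray(series)[-lookback:], 1.0)
    return float(window @ w)

# Toy check: a noiseless linear trend is forecast exactly,
# since "next = previous + 1" is expressible by linear weights.
t = np.arange(40, dtype=float)
w = fit_linear_forecaster(t, lookback=4)
print(round(predict_next(t, w, lookback=4), 3))  # → 40.0
```

Despite having no attention or recurrence, this kind of one-layer linear map over the lookback window is the core of the strong linear baselines the survey discusses.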
Jongseon Kim
Interdisciplinary Program in Artificial Intelligence, Seoul National University; R&D Department, LG Chem
Hyungjoon Kim
Interdisciplinary Program in Artificial Intelligence, Seoul National University; R&D Department, Samsung SDI
HyunGi Kim
Seoul National University
Deep Learning, Anomaly Detection, Time-series, Bioinformatics
Dongjun Lee
Interdisciplinary Program in Artificial Intelligence, Seoul National University
Sungroh Yoon
Professor, Electrical and Computer Engineering & Artificial Intelligence, Seoul National University
AI, deep learning, machine learning, on-device AI, bioinformatics