TSGym: Design Choices for Deep Multivariate Time-Series Forecasting

📅 2025-09-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
Current multivariate time series forecasting (MTSF) research predominantly relies on end-to-end holistic evaluation, obscuring the individual contributions of model components and hindering precise identification of performance bottlenecks. To address this, we propose a fine-grained analytical framework that systematically disentangles core model modules—including sequence patching, channel independence, attention mechanisms, time-series foundation models, and large language models—and quantifies their isolated impacts. Furthermore, we introduce TSGym, an automated framework enabling component-level composable modeling and cross-dataset transfer optimization, thereby overcoming limitations of fixed architectures and manual hyperparameter tuning. Extensive experiments across multiple benchmarks demonstrate that TSGym significantly outperforms state-of-the-art forecasting and AutoML methods, particularly exhibiting superior generalization under distributional shift. The code is publicly available.

📝 Abstract
Recently, deep learning has driven significant advances in multivariate time series forecasting (MTSF). However, much current MTSF research evaluates models from a holistic perspective, which obscures the contributions of individual components and leaves critical issues unaddressed. Adhering to current modeling paradigms, this work bridges these gaps by systematically decomposing deep MTSF methods into their core, fine-grained components, such as series-patching tokenization, the channel-independent strategy, attention modules, and even Large Language Models and Time-series Foundation Models. Through extensive experiments and component-level analysis, our work offers deeper insights than previous benchmarks, which typically discuss models as a whole. Furthermore, we propose a novel automated solution called TSGym for MTSF tasks. Unlike traditional hyperparameter tuning, neural architecture search, or fixed model selection, TSGym performs fine-grained component selection and automated model construction, which enables more effective solutions tailored to diverse time series data, thereby enhancing model transferability across different data sources and robustness against distribution shifts. Extensive experiments indicate that TSGym significantly outperforms existing state-of-the-art MTSF and AutoML methods. All code is publicly available at https://github.com/SUFE-AILAB/TSGym.
Problem

Research questions and friction points this paper is trying to address.

Systematically decomposes deep multivariate forecasting into core components
Addresses limitations of holistic model evaluation in time series analysis
Automates fine-grained component selection for improved model transferability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decomposes MTSF models into fine-grained components
Proposes TSGym for automated component selection
Enables tailored model construction for diverse data
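Two of the components named above, series-patching tokenization and the channel-independent strategy, can be illustrated with a minimal NumPy sketch. The function names and parameters below are purely illustrative and are not TSGym's actual API: patching splits each series into fixed-length windows that serve as tokens, and channel independence tokenizes every variate on its own instead of mixing channels up front.

```python
import numpy as np

def patch_series(x, patch_len, stride):
    """Split a univariate series into (possibly overlapping) patches.

    x: 1-D array of length T; returns array of shape (num_patches, patch_len).
    """
    num_patches = (len(x) - patch_len) // stride + 1
    return np.stack([x[i * stride : i * stride + patch_len]
                     for i in range(num_patches)])

def channel_independent_patches(series, patch_len=16, stride=8):
    """Channel-independent tokenization: patch each channel separately,
    so a shared backbone can process variates one at a time.

    series: array of shape (T, C) -> list of C patch arrays.
    """
    return [patch_series(series[:, c], patch_len, stride)
            for c in range(series.shape[1])]

# Example: 96 timesteps, 3 channels -> 3 per-channel token sequences.
x = np.random.randn(96, 3)
tokens = channel_independent_patches(x)
print(len(tokens), tokens[0].shape)  # 3 (11, 16)
```

With a 96-step input, patch length 16, and stride 8, each channel yields (96 - 16) // 8 + 1 = 11 tokens of length 16; a component-composable framework in the spirit of TSGym would treat choices like `patch_len`, `stride`, and channel mixing vs. independence as searchable design decisions.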
Shuang Liang
AI Lab, Shanghai University of Finance and Economics
Chaochuan Hou
AI Lab, Shanghai University of Finance and Economics
Xu Yao
AI Lab, Shanghai University of Finance and Economics
Shiping Wang
Fuzhou University
machine learning · explainable deep learning · multi-view learning · graph neural network
Minqi Jiang
AI Lab, Shanghai University of Finance and Economics
Songqiao Han
Shanghai University of Finance and Economics
NLP · knowledge graph · anomaly detection
Hailiang Huang
AI Lab, Shanghai University of Finance and Economics; MoE Key Laboratory of Interdisciplinary Research of Computation and Economics