🤖 AI Summary
Current multivariate time series forecasting (MTSF) research predominantly relies on end-to-end holistic evaluation, obscuring the individual contributions of model components and hindering precise identification of performance bottlenecks. To address this, we propose a fine-grained analytical framework that systematically disentangles core model modules—including sequence patching, channel independence, attention mechanisms, time-series foundation models, and large language models—and quantifies their isolated impacts. Furthermore, we introduce TSGym, an automated framework enabling component-level composable modeling and cross-dataset transfer optimization, thereby overcoming limitations of fixed architectures and manual hyperparameter tuning. Extensive experiments across multiple benchmarks demonstrate that TSGym significantly outperforms state-of-the-art forecasting and AutoML methods, particularly exhibiting superior generalization under distributional shift. The code is publicly available.
📝 Abstract
Recently, deep learning has driven significant advancements in multivariate time series forecasting (MTSF) tasks. However, much of the current research in MTSF evaluates models from a holistic perspective, which obscures the contributions of individual components and leaves critical design questions unaddressed. Adhering to the current modeling paradigms, this work bridges these gaps by systematically decomposing deep MTSF methods into their core, fine-grained components, such as series-patching tokenization, the channel-independence strategy, attention modules, and even Large Language Models (LLMs) and Time-series Foundation Models (TSFMs). Through extensive experiments and component-level analysis, our work offers deeper insights than previous benchmarks, which typically discuss models as a whole.
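To make two of the decomposed components concrete, here is a minimal sketch of series patching combined with the channel-independence strategy, where each channel of a multivariate series is patched into tokens on its own. This is an illustrative implementation for intuition only; the function names and shapes are assumptions, not the paper's actual code.

```python
import numpy as np

def patch_series(x, patch_len, stride):
    """Split a univariate series of length L into overlapping patches.

    Returns an array of shape (num_patches, patch_len), where
    num_patches = (L - patch_len) // stride + 1.
    """
    num_patches = (len(x) - patch_len) // stride + 1
    return np.stack([x[i * stride : i * stride + patch_len]
                     for i in range(num_patches)])

def channel_independent_patches(series, patch_len, stride):
    """Patch each channel of a (L, C) multivariate series separately,
    mirroring the channel-independence strategy: every channel becomes
    its own sequence of patch tokens."""
    return [patch_series(series[:, c], patch_len, stride)
            for c in range(series.shape[1])]

# Toy input: look-back window L=96 with C=2 channels.
x = np.arange(96 * 2, dtype=float).reshape(96, 2)
tokens = channel_independent_patches(x, patch_len=16, stride=8)
print(len(tokens), tokens[0].shape)  # 2 channels, each (11, 16)
```

Each per-channel token sequence would then be fed independently to a shared backbone (e.g. an attention module), which is what makes the strategy robust to varying channel counts across datasets.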
Furthermore, we propose a novel automated solution called TSGym for MTSF tasks. Unlike traditional hyperparameter tuning, neural architecture search, or fixed model selection, TSGym performs fine-grained component selection and automated model construction, enabling more effective solutions tailored to diverse time series data and thereby enhancing model transferability across data sources and robustness against distribution shifts. Extensive experiments indicate that TSGym significantly outperforms existing state-of-the-art MTSF and AutoML methods. All code is publicly available at https://github.com/SUFE-AILAB/TSGym.
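The component-level construction described above can be pictured as picking one implementation per design dimension and composing the picks into a forecaster. The sketch below illustrates that idea with a hypothetical registry; the slot names, choices, and builder function are illustrative assumptions, not TSGym's actual API.

```python
from functools import reduce

# Hypothetical component registry: each slot (design dimension) offers
# interchangeable implementations. Strings stand in for real modules.
REGISTRY = {
    "tokenizer": {"patch": lambda x: f"patch({x})",
                  "point": lambda x: f"point({x})"},
    "mixing":    {"channel_independent": lambda x: f"ci({x})",
                  "channel_mixing":      lambda x: f"cm({x})"},
    "backbone":  {"attention": lambda x: f"attn({x})",
                  "mlp":       lambda x: f"mlp({x})"},
}

def build_forecaster(choices):
    """Compose one implementation per slot into a single pipeline,
    mimicking fine-grained component selection and automated model
    construction."""
    stages = [REGISTRY[slot][choices[slot]]
              for slot in ("tokenizer", "mixing", "backbone")]
    return lambda x: reduce(lambda acc, f: f(acc), stages, x)

model = build_forecaster({"tokenizer": "patch",
                          "mixing": "channel_independent",
                          "backbone": "attention"})
print(model("x"))  # attn(ci(patch(x)))
```

Because the search space is a product of per-slot choices rather than a fixed menu of whole models, the selected combination can differ per dataset, which is the mechanism behind the cross-dataset transferability claim.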