🤖 AI Summary
Hyperparameter optimization (HPO) practice for Transformer-based time series forecasting (TSF) remains fragmented, tightly coupled to individual architectures, and poorly reusable. To address this, the paper presents a unified, plug-and-play HPO pipeline that decouples optimization logic from model implementation, enabling support for diverse state-of-the-art architectures, including Transformers as well as Mamba and TimeMixer. The pipeline integrates search with early stopping into an end-to-end automated workflow, and extensive experiments on standard benchmark datasets compare tuned models and distill practical, empirically grounded tuning guidelines for both industry practitioners and academic researchers. The code and complete experimental results are publicly available on GitHub.
📝 Abstract
Transformer-based models for time series forecasting (TSF) have attracted significant attention in recent years due to their effectiveness and versatility. However, these models often require extensive hyperparameter optimization (HPO) to achieve the best possible performance, and a unified pipeline for HPO in Transformer-based TSF remains lacking. In this paper, we present one such pipeline and conduct extensive experiments on several state-of-the-art (SOTA) Transformer-based TSF models. These experiments are conducted on standard benchmark datasets to evaluate and compare the performance of different models, generating practical insights and examples. Our pipeline is generalizable beyond Transformer-based architectures and can be applied to other SOTA models, such as Mamba and TimeMixer, as demonstrated in our experiments. The goal of this work is to provide valuable guidance to both industry practitioners and academic researchers in efficiently identifying optimal hyperparameters suited to their specific domain applications. The code and complete experimental results are available on GitHub.
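To make the idea of a model-agnostic HPO pipeline concrete, the sketch below shows a minimal tuning loop in which the model is exposed only through a `train_step(params, epoch) -> validation loss` callable, keeping the optimizer fully decoupled from the model implementation. This is an illustrative simplification, not the paper's actual API: all names (`tune`, `train_step`, `search_space`, `toy_step`) are hypothetical, and the search strategy here is plain random search with a median-based early-stopping (pruning) rule, standing in for the more sophisticated search a real pipeline would use.

```python
import random

def tune(train_step, search_space, n_trials=20, n_epochs=10, seed=0):
    """Tune any model exposed via `train_step(params, epoch) -> val loss`.

    The model implementation stays fully decoupled from this loop;
    swapping a Transformer for Mamba or TimeMixer only changes the
    callable, not the tuner.
    """
    rng = random.Random(seed)
    history = {e: [] for e in range(n_epochs)}  # losses observed per epoch
    best_params, best_loss = None, float("inf")

    for _ in range(n_trials):
        # Sample one configuration from the (discrete) search space.
        params = {k: rng.choice(v) for k, v in search_space.items()}
        loss = float("inf")
        for epoch in range(n_epochs):
            loss = train_step(params, epoch)
            seen = history[epoch]
            # Early stopping: prune trials that fall behind the median
            # of previously observed losses at the same epoch.
            if seen and loss > sorted(seen)[len(seen) // 2]:
                break
            seen.append(loss)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

# Toy stand-in for model training: validation loss is a quadratic
# penalty around lr = 1e-2 that improves as epochs accumulate.
space = {"lr": [1e-4, 1e-3, 1e-2, 1e-1], "d_model": [64, 128, 256]}

def toy_step(params, epoch):
    return (params["lr"] - 1e-2) ** 2 + 1.0 / (epoch + 1)

best, best_loss = tune(toy_step, space)
```

In a practical pipeline, the random sampler would be replaced by a guided search (e.g., a Bayesian-optimization library) and the pruning rule by the library's scheduler, but the decoupled `train_step` interface is the part that makes the tuner reusable across architectures.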