🤖 AI Summary
In multivariate time series forecasting, Transformers often underperform MLPs due to unstable inter-channel dependencies. To address this, we propose EMAformer, a lightweight, plug-in embedding enhancement module that systematically introduces three inductive biases tailored to temporal modeling: global stability, phase sensitivity, and cross-axis specificity. Integrated seamlessly into standard Transformer architectures, EMAformer employs auxiliary embeddings, jointly optimized with an MLP, to enable channel-specific representation learning, while incorporating phase-aware temporal alignment and global feature enrichment. Evaluated on 12 real-world benchmarks, EMAformer consistently outperforms state-of-the-art methods, reducing MSE by an average of 2.73% and MAE by 5.15%. The approach significantly improves the robustness and practicality of Transformers for multivariate time series forecasting without architectural overhauls or substantial computational overhead.
📝 Abstract
Multivariate time series forecasting is crucial across a wide range of domains. Although iTransformer marks notable progress for the Transformer architecture, it still lags behind the latest MLP-based models. We attribute this performance gap to unstable inter-channel relationships. To bridge this gap, we propose EMAformer, a simple yet effective model that equips the Transformer with an auxiliary embedding suite, akin to armor that reinforces its capability. By introducing three key inductive biases, i.e., *global stability*, *phase sensitivity*, and *cross-axis specificity*, EMAformer unlocks further potential of the Transformer architecture, achieving state-of-the-art performance on 12 real-world benchmarks and reducing forecasting errors by an average of 2.73% in MSE and 5.15% in MAE. This significantly advances the practical applicability of Transformer-based approaches for multivariate time series forecasting. The code is available at https://github.com/PlanckChang/EMAformer.
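To make the auxiliary-embedding idea concrete, here is a minimal, hypothetical sketch (not the paper's actual implementation; all function and variable names are illustrative assumptions) of how the three inductive biases could map onto additive embeddings on top of iTransformer-style per-channel tokens: a per-channel embedding for cross-axis specificity, a phase-indexed embedding for phase sensitivity, and a single shared embedding for global stability.

```python
import numpy as np

# Hypothetical sketch of an auxiliary embedding suite; names and shapes
# are illustrative assumptions, not the published EMAformer code.
def augment_embeddings(tokens, channel_emb, phase_emb, global_emb):
    """
    tokens:      (C, D) per-channel token embeddings (iTransformer-style:
                 each channel's whole series is embedded as one token)
    channel_emb: (C, D) learnable per-channel embedding -> cross-axis specificity
    phase_emb:   (D,)   embedding looked up by the window's phase -> phase sensitivity
    global_emb:  (D,)   single embedding shared by all tokens -> global stability
    """
    # phase_emb and global_emb broadcast over the channel axis
    return tokens + channel_emb + phase_emb + global_emb

C, D = 7, 16  # 7 channels, 16-dim model
rng = np.random.default_rng(0)
tokens = rng.normal(size=(C, D))
out = augment_embeddings(tokens,
                         channel_emb=np.zeros((C, D)),
                         phase_emb=np.zeros(D),
                         global_emb=np.zeros(D))
print(out.shape)  # (7, 16)
```

Because the enhancement is purely additive at the embedding stage, it plugs into a standard Transformer encoder without touching the attention layers, which is what keeps the computational overhead small.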