EMAformer: Enhancing Transformer through Embedding Armor for Time Series Forecasting

πŸ“… 2025-11-11
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
In multivariate time series forecasting, Transformers often underperform MLP-based models due to unstable inter-channel dependencies. To address this, the authors propose EMAformer, a simple yet effective model that augments a standard Transformer with a lightweight auxiliary embedding suite introducing three inductive biases tailored to temporal modeling: global stability, phase sensitivity, and cross-axis specificity. The auxiliary embeddings are jointly optimized with the backbone, enabling channel-specific representation learning together with phase-aware temporal alignment and global feature enrichment. Evaluated on 12 real-world benchmarks, EMAformer consistently outperforms state-of-the-art methods, reducing MSE by an average of 2.73% and MAE by 5.15%. The approach improves the robustness and practicality of Transformers for multivariate time series forecasting without architectural overhauls or substantial computational overhead.

πŸ“ Abstract
Multivariate time series forecasting is crucial across a wide range of domains. While iTransformer marks notable progress for the Transformer architecture, it still lags behind the latest MLP-based models. We attribute this performance gap to unstable inter-channel relationships. To bridge this gap, we propose EMAformer, a simple yet effective model that enhances the Transformer with an auxiliary embedding suite, akin to armor that reinforces its ability. By introducing three key inductive biases, i.e., *global stability*, *phase sensitivity*, and *cross-axis specificity*, EMAformer unlocks the further potential of the Transformer architecture, achieving state-of-the-art performance on 12 real-world benchmarks and reducing forecasting errors by an average of 2.73% in MSE and 5.15% in MAE. This significantly advances the practical applicability of Transformer-based approaches for multivariate time series forecasting. The code is available at https://github.com/PlanckChang/EMAformer.
Problem

Research questions and friction points this paper is trying to address.

Enhancing Transformer for multivariate time series forecasting
Addressing unstable inter-channel relationships in time series
Improving Transformer performance against MLP-based models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Enhances Transformer with auxiliary embedding suite
Introduces global stability inductive bias
Adds phase sensitivity and cross-axis specificity
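The paper's implementation details are not reproduced on this page, but the abstract's description — enriching iTransformer-style per-channel tokens with auxiliary embeddings for global stability and phase sensitivity — can be sketched in a few lines. Everything below (shapes, the sinusoidal phase encoding, the names `chan_emb` and `phase_embedding`) is an illustrative assumption, not the authors' actual code:

```python
import numpy as np

def phase_embedding(t, period, d_model):
    # Illustrative phase-sensitive encoding (assumption): sinusoids of the
    # phase of timestep t within a known seasonal period.
    phase = 2 * np.pi * (t % period) / period
    freqs = np.arange(1, d_model // 2 + 1)
    return np.concatenate([np.sin(freqs * phase), np.cos(freqs * phase)])

rng = np.random.default_rng(0)
C, L, D = 3, 24, 8              # channels, lookback length, embedding dim
x = rng.normal(size=(C, L))     # one multivariate series (channels x time)

# iTransformer-style value projection: each channel's full series -> one token
W = rng.normal(size=(L, D)) * 0.1

# Auxiliary embeddings, jointly trainable in the real model:
chan_emb = rng.normal(size=(C, D))                   # per-channel ("global stability")
phase = phase_embedding(t=5, period=24, d_model=D)   # phase of the last observed step

# Enriched channel tokens fed to the Transformer encoder
tokens = x @ W + chan_emb + phase
print(tokens.shape)  # (3, 8)
```

In this sketch the auxiliary terms are simply added to the value projection, so the backbone architecture is untouched — consistent with the paper's claim of a plug-in enhancement rather than an architectural overhaul.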
Zhiwei Zhang
School of Computer Science and Technology, Beijing Jiaotong University, Beijing, China
Xinyi Du
Beijing Normal University, Beijing, China
Xuanchi Guo
School of Computer Science and Technology, Beijing Jiaotong University, Beijing, China
Weihao Wang
School of Computer Science and Technology, Beijing Jiaotong University, Beijing, China
Wenjuan Han
Beijing Jiaotong University
Natural Language Processing · Machine Learning · Artificial Intelligence · Grammar Induction