🤖 AI Summary
Traditional ARIMA models face challenges in end-to-end multi-step forecasting for long-term time series: they must decode iteratively one step at a time, and they do not extend naturally to multivariate settings. To address this, we propose ARMA-CNN—a lightweight dual-branch convolutional module that explicitly decouples AR and MA dynamics into separate convolutional pathways: one capturing global trend evolution and the other modeling local fluctuations. Crucially, absolute positional information is inherently encoded via causal convolutional receptive fields, eliminating the need for explicit position embeddings or iterative decoding in sequential models built on this block. The architecture natively supports both univariate and multivariate forecasting, offering structural simplicity and strong interpretability. Evaluated on nine benchmark datasets, ARMA-CNN achieves competitive accuracy against state-of-the-art deep models—especially under abrupt trend shifts—while maintaining high computational efficiency and robustness.
📝 Abstract
This paper proposes a simple yet effective convolutional module for long-term time series forecasting. The proposed block, inspired by the Auto-Regressive Integrated Moving Average (ARIMA) model, consists of two convolutional components: one for capturing the trend (autoregression) and the other for refining local variations (moving average). Unlike conventional ARIMA, which requires iterative one-step-ahead prediction, the block performs multi-step forecasting directly in a single pass, making it easily extendable to multivariate settings. Experiments on nine widely used benchmark datasets demonstrate that our method, ARMA, achieves competitive accuracy, particularly on datasets exhibiting strong trend variations, while maintaining architectural simplicity. Furthermore, analysis shows that the block inherently encodes absolute positional information, suggesting its potential as a lightweight replacement for positional embeddings in sequential models.
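The architecture described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the kernel sizes, the untrained random weights, and the function names (`causal_conv1d`, `arma_conv_forecast`) are all assumptions. It shows the three ingredients the abstract names: a long-kernel causal branch for the trend (AR-like), a short-kernel causal branch for local variations (MA-like), and a linear head that emits the whole forecast horizon at once instead of decoding iteratively.

```python
import numpy as np

def causal_conv1d(x, w):
    # Causal cross-correlation: left-pad with zeros so output[t]
    # depends only on x[0..t] -- no leakage from future inputs.
    k = len(w)
    xp = np.concatenate([np.zeros(k - 1), x])
    return np.array([xp[t:t + k] @ w for t in range(len(x))])

def arma_conv_forecast(x, horizon, k_ar=8, k_ma=3, seed=0):
    # Hypothetical dual-branch sketch (kernel sizes are assumptions):
    # a long-kernel branch for slow trend (AR-like) plus a short-kernel
    # branch for local fluctuations (MA-like). Weights are random and
    # untrained -- the point is the dataflow, not the accuracy.
    rng = np.random.default_rng(seed)
    w_ar = rng.standard_normal(k_ar) / k_ar
    w_ma = rng.standard_normal(k_ma) / k_ma
    h = causal_conv1d(x, w_ar) + causal_conv1d(x, w_ma)
    # Direct multi-step head: one linear map from the hidden sequence
    # to all `horizon` steps, with no iterative decoding loop.
    W_head = rng.standard_normal((horizon, len(x))) / len(x)
    return W_head @ h
```

The zero left-padding is also what encodes absolute position: the effective kernel seen at early time steps differs from that at later ones, so each hidden state carries information about where it sits in the sequence, which is the property the abstract proposes as a substitute for positional embeddings.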