N-BEATS-MOE: N-BEATS with a Mixture-of-Experts Layer for Heterogeneous Time Series Forecasting

📅 2025-08-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the limited adaptability of single-model approaches in heterogeneous time series forecasting, this paper proposes N-BEATS-MOE: an extension of the interpretable N-BEATS framework that incorporates a Mixture-of-Experts (MoE) layer with a learnable gating network for block-level dynamic expert selection and weighting. This design preserves N-BEATS' intrinsic trend/seasonality decomposition and interpretability, while the gating weights explicitly reflect each time series' reliance on distinct expert components. Experiments across 12 benchmark datasets show that N-BEATS-MOE achieves consistent improvements over the original N-BEATS and several state-of-the-art baselines, particularly on datasets composed of heterogeneous time series. The results support the effectiveness and generality of the proposed dynamic expert adaptation mechanism.

📝 Abstract
Deep learning approaches are increasingly relevant for time series forecasting tasks. Methods such as N-BEATS, which is built on stacks of multilayer perceptron (MLP) blocks, have achieved state-of-the-art results on benchmark datasets and competitions. N-BEATS is also more interpretable than other deep learning approaches, as it decomposes forecasts into distinct time series components, such as trend and seasonality. In this work, we present N-BEATS-MOE, an extension of N-BEATS based on a Mixture-of-Experts (MoE) layer. N-BEATS-MOE employs a dynamic block weighting strategy based on a gating network, which allows the model to better adapt to the characteristics of each time series. We also hypothesize that the gating mechanism provides additional interpretability by identifying which expert is most relevant for each series. We evaluate our method across 12 benchmark datasets against several approaches, achieving consistent improvements on several datasets, especially those composed of heterogeneous time series.
Problem

Research questions and friction points this paper is trying to address.

Extends N-BEATS for heterogeneous time series forecasting
Improves adaptability via Mixture-of-Experts dynamic weighting
Enhances interpretability by identifying relevant experts per series
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends N-BEATS with Mixture-of-Experts layer
Uses dynamic block weighting strategy
Improves interpretability via expert gating
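The dynamic block weighting idea above can be sketched minimally: a gating network maps the input window to a softmax distribution over expert blocks, and the forecast is the gate-weighted sum of the blocks' outputs. The numpy sketch below uses illustrative dimensions and plain linear maps as stand-ins for N-BEATS blocks; all names and sizes are assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)

# Hypothetical dimensions: lookback window of 24 steps, forecast horizon
# of 6, and 3 expert blocks (all illustrative).
lookback, horizon, n_experts = 24, 6, 3

x = rng.normal(size=(lookback,))  # one input series window

# Each "expert" stands in for an N-BEATS block: here, a linear map
# from the lookback window to the forecast horizon.
experts = [rng.normal(scale=0.1, size=(lookback, horizon))
           for _ in range(n_experts)]

# Gating network: a linear layer producing one logit per expert,
# normalized with a softmax so the weights sum to 1.
W_gate = rng.normal(scale=0.1, size=(lookback, n_experts))
gate = softmax(x @ W_gate)  # shape (n_experts,)

# Forecast = gate-weighted combination of the expert forecasts.
forecast = sum(g * (x @ W) for g, W in zip(gate, experts))

print(gate.shape, forecast.shape)  # (3,) (6,)
```

Inspecting `gate` per series is what would give the interpretability the paper hypothesizes: series dominated by one expert receive most of their weight from that expert's block.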
Ricardo Matos
Faculdade de Engenharia da Universidade do Porto, Porto, Portugal
Luis Roque
Faculdade de Engenharia da Universidade do Porto, Porto, Portugal; Laboratory for Artificial Intelligence and Computer Science (LIACC), Portugal
Vitor Cerqueira
University of Porto, Faculty of Engineering
Machine learning · Time series