🤖 AI Summary
To address the challenges of multi-scale pattern coupling, concept drift, and data scarcity in load forecasting for complex dynamic systems (e.g., cloud computing, power grids, transportation), this paper proposes a novel time-series forecasting paradigm grounded in *meta-patterns*—fundamental waveform primitives. We design an end-to-end interpretable Transformer architecture featuring two innovations: (i) *meta-pattern pooling*, which extracts and denoises salient waveforms from raw sequences; and (ii) an *Echo mechanism*, enabling adaptive reconstruction to model their temporal evolution under concept drift. Evaluated across eight public benchmarks and three industrial deployment scenarios, our method consistently outperforms 15 state-of-the-art models, reducing average relative error by 37%. Crucially, it achieves, for the first time, *intrinsic waveform-level interpretability*—directly linking predictions to physically meaningful meta-patterns—while demonstrating strong cross-scenario generalization without retraining.
📝 Abstract
Time series forecasting is a critical and practical problem in many real-world applications, especially in industrial scenarios, where load forecasting underpins the intelligent operation of modern systems such as clouds, power grids, and traffic networks. However, the inherent complexity and dynamics of these systems present significant challenges. Although advances in methods such as pattern recognition and anti-non-stationarity designs have led to performance gains, current methods fail to consistently ensure effectiveness across various system scenarios due to the intertwined issues of complex patterns, concept drift, and few-shot problems. To address these challenges simultaneously, we introduce a novel scheme centered on fundamental waveforms, a.k.a. meta-patterns. Specifically, we develop a unique Meta-pattern Pooling mechanism to purify and maintain meta-patterns, capturing the nuanced nature of system loads. Complementing this, the proposed Echo mechanism adaptively leverages the meta-patterns, enabling flexible and precise pattern reconstruction. Our Meta-pattern Echo Transformer (MetaEformer) seamlessly integrates these mechanisms with a Transformer-based predictor, offering end-to-end efficiency and interpretability of core processes. Demonstrating superior performance across eight benchmarks under three system scenarios, MetaEformer achieves a 37% relative improvement over fifteen state-of-the-art baselines.
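To make the two mechanisms concrete, the following is a minimal, illustrative sketch of the general idea: pooling candidate meta-patterns by clustering normalized sliding windows, then "echoing" them back as a similarity-weighted recombination for a query segment. All function names, the k-means-style pooling, and the softmax-weighted reconstruction are our own simplifying assumptions, not the paper's actual MetaEformer implementation.

```python
import numpy as np

def extract_meta_patterns(series, window, n_patterns, n_iter=20, seed=0):
    """Toy 'meta-pattern pooling' (assumed, not the paper's method):
    cluster z-normalized sliding windows with a simple k-means and keep
    the centroids as denoised candidate waveform primitives."""
    wins = np.lib.stride_tricks.sliding_window_view(series, window)
    # z-normalize each window so clustering focuses on shape, not scale
    wins = (wins - wins.mean(axis=1, keepdims=True)) / (wins.std(axis=1, keepdims=True) + 1e-8)
    rng = np.random.default_rng(seed)
    centroids = wins[rng.choice(len(wins), n_patterns, replace=False)].copy()
    for _ in range(n_iter):
        # assign each window to its nearest centroid, then recompute centroids
        labels = np.argmin(((wins[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for k in range(n_patterns):
            if np.any(labels == k):
                centroids[k] = wins[labels == k].mean(axis=0)
    return centroids

def echo_reconstruct(query, patterns, temperature=1.0):
    """Toy 'echo' step (assumed): score the query segment against the
    pattern pool and return a softmax-weighted recombination, so the
    output stays linked to physically meaningful meta-patterns."""
    q = (query - query.mean()) / (query.std() + 1e-8)
    sims = patterns @ q / len(q)            # average inner-product similarity
    w = np.exp(sims / temperature)
    w /= w.sum()                            # attention-like weights over patterns
    return w @ patterns                     # reconstructed (normalized) waveform
```

In this simplified view, the weights `w` provide the waveform-level interpretability: each prediction can be traced back to the meta-patterns it recombines.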