🤖 AI Summary
This work addresses the challenge of balancing computational efficiency and noise robustness in long-horizon multivariate time series forecasting. Transformers suffer from quadratic computational complexity, while linear state-space models struggle to suppress high-frequency noise effectively; the proposed ASGMamba framework therefore pairs a Mamba backbone with a lightweight adaptive spectral gating (ASG) mechanism that dynamically filters local frequency-domain noise. It further incorporates a hierarchical multi-scale architecture and variable-specific node embeddings to capture heterogeneous dynamics across variables while retaining linear O(L) complexity. Evaluated on nine benchmark datasets, ASGMamba achieves state-of-the-art performance while significantly reducing memory consumption on long-horizon predictions, making it well-suited for high-performance applications under resource constraints.
📝 Abstract
Long-term multivariate time series forecasting (LTSF) plays a crucial role in various high-performance computing applications, including real-time energy grid management and large-scale traffic flow simulation. However, existing solutions face a dilemma: Transformer-based models suffer from quadratic complexity, limiting their scalability on long sequences, while linear State Space Models (SSMs) often struggle to distinguish valuable signals from high-frequency noise, leading to wasted state capacity. To bridge this gap, we propose ASGMamba, an efficient forecasting framework designed for resource-constrained supercomputing environments. ASGMamba integrates a lightweight Adaptive Spectral Gating (ASG) mechanism that dynamically filters noise based on local spectral energy, enabling the Mamba backbone to focus its state evolution on robust temporal dynamics. Furthermore, we introduce a hierarchical multi-scale architecture with variable-specific Node Embeddings to capture diverse physical characteristics. Extensive experiments on nine benchmarks demonstrate that ASGMamba achieves state-of-the-art accuracy. While maintaining strictly $$\mathcal{O}(L)$$ complexity, ASGMamba significantly reduces memory usage on long-horizon tasks, establishing it as a scalable solution for high-throughput forecasting in resource-limited environments. The code is available at https://github.com/hit636/ASGMamba
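The core ASG idea, gating frequency bins by their local spectral energy so the SSM spends state capacity on signal rather than high-frequency noise, can be sketched as follows. This is a minimal NumPy illustration under our own assumptions: the paper's ASG is a learned, lightweight gate inside the network, whereas here we approximate it with a fixed energy-normalized soft gate and a hypothetical `alpha` smoothing parameter.

```python
import numpy as np

def adaptive_spectral_gate(x, alpha=1.0):
    """Soft-gate a time-series patch in the frequency domain.

    Illustrative stand-in for ASG: bins whose energy is large relative
    to the patch's mean spectral energy pass nearly unchanged, while
    low-energy (noise-dominated) bins are attenuated. `alpha` is an
    assumed smoothing constant, not a parameter from the paper.
    """
    spec = np.fft.rfft(x)                      # local frequency content
    energy = np.abs(spec) ** 2                 # per-bin spectral energy
    gate = energy / (energy + alpha * energy.mean())  # soft gate in [0, 1)
    return np.fft.irfft(spec * gate, n=len(x)) # back to the time domain

# Usage: a clean low-frequency signal corrupted by broadband noise.
t = np.linspace(0.0, 1.0, 256, endpoint=False)
clean = np.sin(2 * np.pi * 4 * t)
rng = np.random.default_rng(0)
noisy = clean + 0.3 * rng.standard_normal(t.size)
filtered = adaptive_spectral_gate(noisy)
# The gated patch should sit closer to the clean signal than the noisy input.
```

Because the dominant sinusoid concentrates its energy in a single bin, its gate is close to 1, while the noise energy is spread thinly across all bins and is suppressed, which is the intuition behind letting the Mamba backbone evolve its state on the filtered dynamics.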