🤖 AI Summary
Existing deep learning models struggle to capture strong periodic structure in long-term time series forecasting. To address this, we propose a spectrum-guided forecasting framework: first, dominant periodic frequencies are extracted by jointly applying the Fast Fourier Transform (FFT) and coordinate descent; second, positional embeddings are initialized with sinusoidal functions at these frequencies, and a dual-speed learning mechanism is introduced into the Transformer architecture to explicitly preserve critical low-frequency components during training. This design ensures both interpretability and robustness, sidestepping the weak periodicity modeling typical of conventional end-to-end learning. Extensive experiments on real-world benchmarks—including ETT and Weather—demonstrate significant improvements in long-horizon prediction accuracy (96–192 steps). Moreover, on synthetic data, our method accurately recovers the ground-truth frequencies, validating its fidelity to spectral characteristics and its consistency with frequency-domain theory.
📝 Abstract
Many real-world time series exhibit strong periodic structures arising from physical laws, human routines, or seasonal cycles. However, modern deep forecasting models often fail to capture these recurring patterns due to spectral bias and a lack of frequency-aware inductive priors. Motivated by this gap, we propose a simple yet effective method that enhances long-term forecasting by explicitly modeling periodicity through spectral initialization and frequency-constrained optimization. Specifically, we extract dominant low-frequency components via Fast Fourier Transform (FFT)-guided coordinate descent, initialize sinusoidal embeddings with these components, and employ a two-speed learning schedule to preserve meaningful frequency structure during training. Our approach is model-agnostic and integrates seamlessly into existing Transformer-based architectures. Extensive experiments across diverse real-world benchmarks demonstrate consistent performance gains, particularly at long horizons, highlighting the benefits of injecting spectral priors into deep temporal models for robust and interpretable long-range forecasting. Moreover, on synthetic data, our method accurately recovers ground-truth frequencies, further validating its interpretability and effectiveness in capturing latent periodic patterns.
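The core pipeline described above—picking dominant frequencies from the FFT spectrum and using them to initialize sinusoidal embeddings—can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the function names are invented for this sketch, and the coordinate-descent refinement and two-speed learning schedule are omitted.

```python
import numpy as np

def dominant_frequencies(series, k=3):
    """Return the k strongest non-DC frequencies from the FFT magnitude spectrum.

    Hypothetical helper: the paper refines these estimates with coordinate
    descent, which is omitted in this sketch.
    """
    spectrum = np.abs(np.fft.rfft(series - series.mean()))  # de-mean to suppress DC
    freqs = np.fft.rfftfreq(len(series), d=1.0)             # cycles per time step
    top = np.argsort(spectrum)[::-1][:k]                    # indices of largest peaks
    return freqs[top]

def sinusoidal_embedding(length, freqs):
    """Initialize a positional embedding from the extracted frequencies."""
    t = np.arange(length)[:, None]                          # (length, 1) time index
    angles = 2 * np.pi * t * np.asarray(freqs)[None, :]
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=1)

# Synthetic check: recover a planted period of 24 steps from a noisy signal.
t = np.arange(480)
x = np.sin(2 * np.pi * t / 24) + 0.1 * np.random.default_rng(0).normal(size=480)
f = dominant_frequencies(x, k=1)        # should be close to 1/24
emb = sinusoidal_embedding(96, f)       # (96, 2) sin/cos embedding table
```

In a Transformer, the resulting table would replace the random initialization of the positional (or periodic) embedding, with the two-speed schedule assigning it a smaller learning rate than the rest of the model so the extracted frequencies are not washed out early in training.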