Frequency-Constrained Learning for Long-Term Forecasting

📅 2025-08-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing deep learning models struggle to effectively capture strong periodic structures in long-term time series forecasting. To address this, we propose a spectrum-guided forecasting framework: first, dominant periodic frequencies are extracted via joint application of the Fast Fourier Transform (FFT) and coordinate descent; second, positional embeddings are initialized with sinusoidal functions, and a dual-speed learning mechanism is introduced into the Transformer architecture to explicitly preserve critical low-frequency components during training. This approach ensures both interpretability and robustness, circumventing the weak periodic modeling inherent in conventional end-to-end learning. Extensive experiments on real-world benchmarks—including ETT and Weather—demonstrate significant improvements in long-horizon prediction accuracy (96–192 steps). Moreover, on synthetic data, our method accurately recovers ground-truth frequencies, validating its fidelity to spectral characteristics and theoretical consistency in frequency-domain modeling.
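The summary describes extracting dominant periodic frequencies from the training series before model fitting. As a minimal sketch of that first step (the paper pairs the FFT with coordinate descent; here the FFT magnitude spectrum alone serves as a simple proxy, and the function name and `k` parameter are illustrative, not from the paper):

```python
import numpy as np

def dominant_frequencies(series, k=3):
    """Return the k strongest frequency components of a 1-D series via FFT.

    Hedged sketch: magnitude-spectrum peak picking only; the paper refines
    the estimate further with coordinate descent.
    """
    series = np.asarray(series, dtype=float)
    spectrum = np.fft.rfft(series - series.mean())  # drop the DC offset
    freqs = np.fft.rfftfreq(len(series), d=1.0)     # cycles per time step
    top = np.argsort(np.abs(spectrum))[-k:][::-1]   # strongest bins first
    return freqs[top]

# A series with period 24 (e.g. daily cycle at hourly sampling)
# should yield a peak near 1/24.
t = np.arange(480)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size)
print(dominant_frequencies(x, k=1))
```

Because 480 is a multiple of 24, the true frequency falls exactly on an FFT bin; for off-grid periods the coordinate-descent refinement mentioned in the summary would matter more.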

📝 Abstract
Many real-world time series exhibit strong periodic structures arising from physical laws, human routines, or seasonal cycles. However, modern deep forecasting models often fail to capture these recurring patterns due to spectral bias and a lack of frequency-aware inductive priors. Motivated by this gap, we propose a simple yet effective method that enhances long-term forecasting by explicitly modeling periodicity through spectral initialization and frequency-constrained optimization. Specifically, we extract dominant low-frequency components via Fast Fourier Transform (FFT)-guided coordinate descent, initialize sinusoidal embeddings with these components, and employ a two-speed learning schedule to preserve meaningful frequency structure during training. Our approach is model-agnostic and integrates seamlessly into existing Transformer-based architectures. Extensive experiments across diverse real-world benchmarks demonstrate consistent performance gains, particularly at long horizons, highlighting the benefits of injecting spectral priors into deep temporal models for robust and interpretable long-range forecasting. Moreover, on synthetic data, our method accurately recovers ground-truth frequencies, further validating its interpretability and effectiveness in capturing latent periodic patterns.
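The abstract's second step is initializing sinusoidal embeddings with the extracted components. A hedged sketch of what such an initialization could look like, assuming the embedding interleaves sine and cosine columns at each extracted frequency (the function name and layout are illustrative, not the paper's exact design):

```python
import numpy as np

def spectral_positional_embedding(length, freqs):
    """Build a (length, 2k) embedding with sin/cos pairs at given frequencies.

    Hedged sketch: one sine and one cosine column per extracted frequency,
    evaluated at integer time steps 0..length-1.
    """
    t = np.arange(length)[:, None]        # (length, 1) time steps
    f = np.asarray(freqs)[None, :]        # (1, k) frequencies in cycles/step
    phase = 2 * np.pi * f * t             # (length, k) phase grid
    return np.concatenate([np.sin(phase), np.cos(phase)], axis=1)

# Daily (1/24) and weekly (1/168) cycles for a 96-step window.
emb = spectral_positional_embedding(96, [1 / 24, 1 / 168])
print(emb.shape)  # (96, 4)
```

In a Transformer, a matrix like this would replace (or be added to) a randomly initialized positional embedding, so the model starts from the dominant periodicities rather than having to discover them against spectral bias.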
Problem

Research questions and friction points this paper is trying to address.

Enhance long-term forecasting by modeling periodicity explicitly
Address spectral bias in deep forecasting models
Improve frequency-aware inductive priors for time series
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spectral initialization for periodic patterns
Frequency-constrained optimization technique
Two-speed learning schedule that preserves frequency structure during training
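The two-speed schedule can be read as assigning the spectrally initialized parameters a much smaller learning rate than the rest of the network, so the low-frequency structure found by the FFT step is not overwritten early in training. A minimal sketch under that assumption (the parameter names, the `spectral_` prefix convention, and the specific rates are hypothetical):

```python
import numpy as np

def two_speed_step(params, grads, lr_fast=1e-2, lr_slow=1e-4):
    """One gradient step with a slow rate for spectral parameters.

    Hedged sketch: parameters whose names start with 'spectral_' are treated
    as the frequency-carrying group and updated 100x more slowly.
    """
    updated = {}
    for name, value in params.items():
        lr = lr_slow if name.startswith("spectral_") else lr_fast
        updated[name] = value - lr * grads[name]
    return updated

params = {"spectral_freqs": np.array([1 / 24]), "head_weight": np.array([0.5])}
grads = {"spectral_freqs": np.array([1.0]), "head_weight": np.array([1.0])}
new = two_speed_step(params, grads)
```

In a PyTorch-style implementation the same idea would typically be expressed as two optimizer parameter groups with different `lr` values.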
Menglin Kong
Ph.D. Student, McGill University
Spatiotemporal Modeling · Uncertainty Quantification · Probabilistic Machine Learning
Vincent Zhihao Zheng
Department of Civil Engineering, McGill University
Lijun Sun
Department of Civil Engineering, McGill University