🤖 AI Summary
This paper addresses the load time-series downscaling problem—generating high-resolution load forecasts from low-resolution inputs. We propose an enhanced RNN architecture that integrates explicit Fourier-based seasonal embedding with a novel intra-cycle self-attention mechanism operating on high-resolution temporal components. Crucially, our model dynamically fuses explicit Fourier seasonal features with latent-space representations, enabling more accurate and stable long-horizon forecasting. Evaluated on PJM’s four major regions, the proposed model achieves significantly lower RMSE than Prophet (with and without seasonal components and LAA) and ablation baselines (RNN without attention or without the Fourier module). Moreover, prediction error increases more gradually with forecast horizon, demonstrating superior robustness and generalization capability. To the best of our knowledge, this is the first work to incorporate intra-cycle self-attention over high-resolution sub-periods within a Fourier-augmented RNN framework for load downscaling.
📝 Abstract
We present a Fourier-enhanced recurrent neural network (RNN) for downscaling electrical loads. The model combines (i) a recurrent backbone driven by low-resolution inputs, (ii) explicit Fourier seasonal embeddings fused in latent space, and (iii) a self-attention layer that captures dependencies among high-resolution components within each period. Across four PJM territories, the approach achieves lower RMSE, with error that grows more slowly over the forecast horizon, than classical Prophet baselines (with and without seasonality/LAA) and than RNN ablations without attention or Fourier features.
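To make the three components concrete, here is a minimal numpy sketch of the pipeline the abstract describes: explicit sin/cos Fourier features for the position within a period, latent-space fusion by concatenation with (stand-in) recurrent hidden states, and plain scaled dot-product self-attention over one period's high-resolution steps. All shapes, harmonic counts, and the use of random vectors in place of real RNN states are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def fourier_embedding(t, period, n_harmonics=3):
    """Explicit seasonal features: sin/cos harmonics of position within a period."""
    k = np.arange(1, n_harmonics + 1)
    ang = 2 * np.pi * np.outer(t / period, k)                  # (T, K)
    return np.concatenate([np.sin(ang), np.cos(ang)], axis=1)  # (T, 2K)

def self_attention(X):
    """Scaled dot-product self-attention over the rows of X (sub-period steps)."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                          # row-wise softmax
    return w @ X

# Toy pipeline for a single daily period at hourly resolution (hypothetical sizes).
T, period, d_latent = 24, 24, 8
rng = np.random.default_rng(0)
latent = rng.standard_normal((T, d_latent))       # stand-in for RNN hidden states
season = fourier_embedding(np.arange(T), period)  # (24, 6) explicit Fourier features
fused = np.concatenate([latent, season], axis=1)  # latent-space fusion (24, 14)
refined = self_attention(fused)                   # intra-cycle dependencies (24, 14)
print(refined.shape)
```

In the actual model the fusion and attention would be learned (e.g. a gating or projection layer rather than raw concatenation), but the data flow is the same: seasonal structure enters explicitly, and attention relates the high-resolution components within each cycle.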