Fourier-Enhanced Recurrent Neural Networks for Electrical Load Time Series Downscaling

📅 2025-11-27
🤖 AI Summary
This paper addresses the load time-series downscaling problem: generating high-resolution load series from low-resolution inputs. The authors propose an enhanced RNN architecture that integrates explicit Fourier-based seasonal embeddings with a novel intra-cycle self-attention mechanism operating on high-resolution temporal components. Crucially, the model dynamically fuses the explicit Fourier seasonal features with latent-space representations, enabling more accurate and stable long-horizon forecasting. Evaluated on PJM's four major regions, the proposed model achieves significantly lower RMSE than Prophet baselines (with and without seasonal components and LAA) and than ablation baselines (the RNN without attention or without the Fourier module). Its prediction error also grows more gradually with forecast horizon, indicating better robustness and generalization. To the authors' knowledge, this is the first work to apply intra-cycle self-attention over high-resolution sub-periods within a Fourier-augmented RNN framework for load downscaling.

📝 Abstract
We present a Fourier-enhanced recurrent neural network (RNN) for downscaling electrical loads. The model combines (i) a recurrent backbone driven by low-resolution inputs, (ii) explicit Fourier seasonal embeddings fused in latent space, and (iii) a self-attention layer that captures dependencies among the high-resolution components within each period. Across four PJM territories, the approach yields lower RMSE, with flatter error growth across forecast horizons, than classical Prophet baselines (with and without seasonality/LAA) and than RNN ablations without the attention layer or the Fourier features.
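The paper does not include code, but component (ii) has a standard form: seasonal structure is encoded as sine/cosine features at the first few harmonics of a period. The function name, harmonic count, and period choice below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fourier_seasonal_embedding(t, period, n_harmonics=3):
    """Sin/cos features at the first n_harmonics of one seasonal period.

    t          : array of time indices (e.g. hours)
    period     : season length in the same units (e.g. 24 for daily)
    Returns an array of shape (len(t), 2 * n_harmonics):
    first n_harmonics columns are sines, the rest cosines.
    """
    t = np.asarray(t, dtype=float)
    k = np.arange(1, n_harmonics + 1)              # harmonic orders 1..K
    angles = 2.0 * np.pi * np.outer(t, k) / period
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=1)

# One day of hourly indices, daily period, 3 harmonics -> 6 features per step
emb = fourier_seasonal_embedding(np.arange(24), period=24, n_harmonics=3)
print(emb.shape)  # (24, 6)
```

In the paper's architecture these features are not appended to the raw input but fused with the RNN's latent state; multiple periods (daily, weekly, yearly) would each contribute their own block of columns.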
Problem

Research questions and friction points this paper is trying to address.

Downscales electrical load time series to higher resolution
Integrates Fourier seasonal embeddings with recurrent neural networks
Improves accuracy over baseline methods in multiple regions
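To make the downscaling task above concrete: the simplest baseline splits each low-resolution total across its high-resolution slots with a fixed profile, which is exactly the mapping the model learns instead. This toy sketch (names and numbers are illustrative, not from the paper) shows the input/output shapes of the task:

```python
import numpy as np

def naive_downscale(daily_load, hourly_profile):
    """Split each daily total across 24 hours using a fixed shape profile.

    A trivial baseline for the downscaling task: the learned model replaces
    the fixed profile with a data-driven, time-varying one.
    """
    profile = np.asarray(hourly_profile, dtype=float)
    profile = profile / profile.sum()              # normalize to sum to 1
    return np.outer(np.asarray(daily_load, dtype=float), profile).ravel()

daily = [2400.0, 2640.0]                           # two daily totals (MWh)
profile = np.ones(24)                              # flat shape for illustration
hourly = naive_downscale(daily, profile)
print(hourly.shape, hourly[0])  # (48,) 100.0
```

Each day's 24 hourly values sum back to the daily total, which is the consistency constraint any downscaling method should preserve.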
Innovation

Methods, ideas, or system contributions that make the work stand out.

RNN backbone with low-resolution inputs
Fourier seasonal embeddings in latent space
Self-attention for high-resolution dependencies
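The third bullet, intra-cycle self-attention, amounts to scaled dot-product attention applied across the S high-resolution slots within one cycle. A minimal numpy sketch under assumed shapes (24 hourly slots, latent dimension 8; the projection matrices and dimensions are illustrative, not the authors' configuration):

```python
import numpy as np

def intra_cycle_self_attention(x, Wq, Wk, Wv):
    """Scaled dot-product self-attention across one cycle's sub-periods.

    x : (S, d) -- S high-resolution slots within a cycle (e.g. 24 hours),
        each represented by a d-dimensional latent vector.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[1])         # (S, S) slot-to-slot scores
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ v                             # (S, d) mixed slots

rng = np.random.default_rng(0)
S, d = 24, 8                                       # 24 hourly slots, latent dim 8
x = rng.standard_normal((S, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = intra_cycle_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (24, 8)
```

The point of restricting attention to slots inside a single cycle is that every hour can condition on every other hour of the same day, while cross-day dependencies remain the job of the recurrent backbone.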