🤖 AI Summary
Real-world time series often exhibit non-stationarity, regime shifts, and heteroscedasticity, which severely undermine the robustness of standard regression models. To address this, we propose the Variability-Aware Recursive Neural Network (VARNN), which introduces a learnable error-memory state to explicitly model and exploit recent prediction residuals as dynamic bias and volatility signals for real-time forecast calibration. VARNN integrates a feed-forward predictor with a residual-driven memory update mechanism, enabling adaptive forecasting conditioned on short-term residual sequences. Evaluated across diverse domains, including household energy consumption, healthcare monitoring, and environmental sensing, VARNN consistently outperforms static, dynamic, and state-of-the-art recurrent baselines, achieving an average 12.6% reduction in test MSE. Crucially, it attains this performance with a minimal parameter count and negligible inference overhead, offering an exceptional balance of accuracy and computational efficiency.
📝 Abstract
Real-world time series data exhibit non-stationary behavior, regime shifts, and temporally varying (heteroscedastic) noise that degrade the robustness of standard regression models. We introduce the Variability-Aware Recursive Neural Network (VARNN), a novel residual-aware architecture for supervised time-series regression that learns an explicit error memory from recent prediction residuals and uses it to recalibrate subsequent predictions. VARNN augments a feed-forward predictor with a learned error-memory state that is updated from the residuals over a short context window, as a signal of variability and drift, and that conditions the final prediction at the current time step. Across diverse domains (appliance energy, healthcare, and environmental monitoring), experimental results demonstrate that VARNN achieves superior performance, attaining lower test MSE than static, dynamic, and recurrent baselines with minimal computational overhead. Our findings show that VARNN offers robust predictions under drift and volatility, highlighting its potential as a promising framework for time-series learning.
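The core mechanism described above — a feed-forward predictor whose output is conditioned on an error-memory state built from recent residuals — can be sketched roughly as follows. This is a minimal NumPy illustration of one prediction step, not the paper's implementation; all names (`varnn_step`, `W_in`, `W_err`, `b`) and the specific update form are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def varnn_step(x_t, residuals, W_in, W_err, b):
    """One VARNN-style prediction step (illustrative sketch).

    An error-memory vector is computed from the k most recent prediction
    residuals and combined with the current input features, so that the
    final prediction is recalibrated by recent error behavior.
    The exact update/combination rule here is a hypothetical stand-in.
    """
    mem = np.tanh(W_err @ residuals)           # learned error-memory state
    return float(W_in @ x_t + mem.sum() + b)   # residual-recalibrated prediction

# Toy usage: 4 input features, error memory over the k = 3 latest residuals.
d, k = 4, 3
W_in = rng.normal(size=d)          # predictor weights (hypothetical)
W_err = rng.normal(size=(2, k))    # error-memory weights (hypothetical)
b = 0.0

x_t = rng.normal(size=d)                   # current input features
residuals = np.array([0.1, -0.2, 0.05])    # recent prediction errors
y_hat = varnn_step(x_t, residuals, W_in, W_err, b)
```

In a full model, the weights would be trained end to end and the residual buffer would slide forward with each new observation, so the memory state tracks short-term drift and volatility rather than a fixed bias.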