Variability Aware Recursive Neural Network (VARNN): A Residual-Memory Model for Capturing Temporal Deviation in Sequence Regression Modeling

📅 2025-10-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
Real-world time series often exhibit non-stationarity, regime shifts, and heteroscedasticity, severely undermining the robustness of standard regression models. To address this, we propose the Variability-Aware Recursive Neural Network (VARNN), which introduces a learnable error-memory state to explicitly model and exploit recent prediction residuals as dynamic bias and volatility signals for real-time forecast calibration. VARNN integrates a feedforward predictor with a residual-driven memory update mechanism, enabling adaptive forecasting conditioned on short-term residual sequences. Evaluated across diverse domains, including household energy consumption, healthcare monitoring, and environmental sensing, VARNN consistently outperforms static, dynamic, and state-of-the-art recurrent baselines, achieving an average 12.6% reduction in test MSE. Crucially, it attains this performance with a minimal parameter count and negligible inference overhead, offering a strong balance of accuracy and computational efficiency.

📝 Abstract
Real-world time series data exhibit non-stationary behavior, regime shifts, and heteroscedastic noise that degrade the robustness of standard regression models. We introduce the Variability-Aware Recursive Neural Network (VARNN), a novel residual-aware architecture for supervised time-series regression that learns an explicit error memory from recent prediction residuals and uses it to recalibrate subsequent predictions. VARNN augments a feed-forward predictor with a learned error-memory state that is updated from residuals over a short context window, serving as a signal of variability and drift, and then conditions the final prediction at the current time step. Across diverse dataset domains (appliance energy, healthcare, and environmental monitoring), experimental results demonstrate that VARNN achieves superior performance, attaining lower test MSE than static, dynamic, and recurrent baselines with minimal computational overhead. Our findings show that VARNN offers robust predictions under drift and volatility, highlighting its potential as a promising framework for time-series learning.
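The recalibration loop described above can be sketched in a few lines. This is a minimal illustration, not the paper's architecture: the base predictor is a fixed linear map, and the learned residual-driven memory update is replaced by a simple exponential update with a hypothetical smoothing rate `alpha` and correction scale `beta`.

```python
import numpy as np

def varnn_style_forecast(x, y, w, alpha=0.5, beta=1.0):
    """Toy residual-memory forecaster (sketch, not the paper's VARNN).

    x : (T, d) inputs; y : (T,) targets; w : (d,) weights of a fixed
    linear base predictor standing in for the learned feed-forward
    predictor. alpha sets how quickly the error memory tracks recent
    residuals; beta scales the memory-based correction.
    """
    memory = 0.0                          # scalar error-memory state
    preds = np.empty(len(y))
    for t in range(len(y)):
        base = x[t] @ w                   # base prediction from inputs
        preds[t] = base + beta * memory   # recalibrate using the memory
        residual = y[t] - preds[t]        # observed prediction error
        # Exponential update: a fixed stand-in for the learned
        # residual-driven memory update described in the abstract.
        memory = (1 - alpha) * memory + alpha * residual
    return preds
```

On data with a persistent bias (a crude proxy for drift), the memory absorbs part of the systematic error, so the corrected predictions have lower MSE than the base predictor alone.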
Problem

Research questions and friction points this paper is trying to address.

Modeling non-stationary time series with regime shifts
Handling heteroscedastic noise in sequence regression
Capturing temporal deviation using residual memory states
Innovation

Methods, ideas, or system contributions that make the work stand out.

VARNN uses residual memory for temporal deviation modeling
It recalibrates predictions with learned error-memory state
Model achieves robust performance under drift and volatility
Haroon Gharwi
Department of Computer Science, Illinois Institute of Technology, Chicago, IL, USA
Kai Shu
Assistant Professor of Computer Science, Emory University
Data Mining · Trustworthy AI · Social Computing · Machine Learning · AI Safety