Still Competitive: Revisiting Recurrent Models for Irregular Time Series Prediction

📅 2025-10-17
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Forecasting irregularly sampled multivariate time series remains challenging, where complex models offer unclear advantages and traditional RNNs remain underexploited. Method: We propose GRUwE—a GRU variant incorporating a dual reset mechanism: (i) a learnable exponentially decaying time-triggered reset and (ii) an observation-triggered reset—enabling Markovian state updates in continuous time to explicitly model asynchronous and sparse observations. This design balances expressiveness and efficiency, avoiding costly ODE solvers or attention mechanisms. Contribution/Results: GRUwE achieves state-of-the-art (SOTA) forecasting accuracy across multiple real-world benchmarks, with an average 3.2× speedup in inference latency and 40% fewer parameters compared to leading baselines. It thus delivers high performance, low computational overhead, and practical deployability.

Technology Category

Application Category

📝 Abstract
Modeling irregularly sampled multivariate time series is a persistent challenge in domains like healthcare and sensor networks. While recent works have explored a variety of complex learning architectures to solve the prediction problems for irregularly sampled time series, it remains unclear what are the true benefits of some of these architectures, and whether clever modifications of simpler and more efficient RNN-based algorithms are still competitive, i.e. they are on par with or even superior to these methods. In this work, we propose and study GRUwE: Gated Recurrent Unit with Exponential basis functions, that builds upon RNN-based architectures for observations made at irregular times. GRUwE supports both regression-based and event-based predictions in continuous time. GRUwE works by maintaining a Markov state representation of the time series that updates with the arrival of irregular observations. The Markov state update relies on two reset mechanisms: (i) observation-triggered reset, and (ii) time-triggered reset of the GRU state using learnable exponential decays, to support the predictions in continuous time. Our empirical evaluations across several real-world benchmarks on next-observation and next-event prediction tasks demonstrate that GRUwE can indeed achieve competitive to superior performance compared to the recent state-of-the-art (SOTA) methods. Thanks to its simplicity, GRUwE offers compelling advantages: it is easy to implement, requires minimal hyper-parameter tuning efforts, and significantly reduces the computational overhead in the online deployment.
Problem

Research questions and friction points this paper is trying to address.

Modeling irregularly sampled multivariate time series in healthcare and sensor networks
Evaluating whether simpler RNN-based models can compete with complex architectures
Developing continuous-time prediction methods for irregular observations using exponential decays
Innovation

Methods, ideas, or system contributions that make the work stand out.

GRUwE uses exponential basis functions for updates
It employs observation and time-triggered reset mechanisms
The model supports continuous-time regression and event predictions
🔎 Similar Papers
No similar papers found.