🤖 AI Summary
This work addresses the challenge of probabilistic forecasting for irregularly sampled time series by proposing UFO, a novel architecture that integrates U-Net, Transformer, and Neural Controlled Differential Equations (Neural CDEs). The U-Net component enables parallel multi-scale local feature extraction, the Transformer captures global temporal dependencies, and the Neural CDE models continuous-time dynamics. Crucially, the entire model preserves full causality while supporting efficient parallel computation. Evaluated on five standard benchmarks, UFO substantially outperforms ten state-of-the-art methods and achieves up to a 15-fold inference speedup over conventional Neural CDEs, with particularly strong performance in long-sequence and high-dimensional multivariate settings.
📝 Abstract
Probabilistic forecasting of irregularly sampled time series is crucial in domains such as healthcare and finance, yet it remains a formidable challenge. Existing Neural Controlled Differential Equation (Neural CDE) approaches, while effective at modelling continuous dynamics, suffer from slow, inherently sequential computation, which restricts scalability and limits access to global context. We introduce UFO (U-Former ODE), a novel architecture that seamlessly integrates the parallelizable, multiscale feature extraction of U-Nets, the powerful global modelling of Transformers, and the continuous-time dynamics of Neural CDEs. By constructing a fully causal, parallelizable model, UFO achieves a global receptive field while retaining strong sensitivity to local temporal dynamics. Extensive experiments on five standard benchmarks -- covering both regularly and irregularly sampled time series -- demonstrate that UFO consistently outperforms ten state-of-the-art neural baselines in predictive accuracy. Moreover, UFO delivers up to 15$\times$ faster inference compared to conventional Neural CDEs, with consistently strong performance on long and highly multivariate sequences.
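To make the three ingredients concrete, here is a minimal NumPy sketch of the causal building blocks the abstract describes: a left-padded (causal) convolution standing in for the U-Net's local feature extractor, masked self-attention standing in for the Transformer's global context, and a single explicit-Euler step of a controlled differential equation. All function names and the single-head/Euler simplifications are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def causal_conv1d(x, w):
    """Left-padded 1-D convolution: the output at step t uses only x[:t+1].
    Stands in for the U-Net's parallel local feature extraction (illustrative)."""
    k = len(w)
    xp = np.concatenate([np.zeros(k - 1), x])  # pad on the left only => causal
    return np.array([xp[t:t + k] @ w[::-1] for t in range(len(x))])

def causal_attention(h):
    """Single-head self-attention with a lower-triangular mask:
    step t attends to steps <= t, giving a global yet causal receptive field."""
    T, d = h.shape
    scores = h @ h.T / np.sqrt(d)
    scores[np.triu(np.ones((T, T), dtype=bool), k=1)] = -np.inf  # hide the future
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ h

def cde_euler_step(z, dx, f):
    """One explicit-Euler step of a CDE dz = f(z) dx, driven by the (possibly
    irregular) path increment dx; f(z) maps the state to a (d_z, d_x) matrix."""
    return z + f(z) @ dx
```

Both sequence operators are causal by construction: perturbing a future input leaves all earlier outputs unchanged, which is the property that lets such a model be trained in parallel while remaining a valid forecaster. The CDE step shows why plain ODE-style solvers are sequential, and hence why replacing the recurrence with parallel causal layers yields the reported speedups.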