Nonparametric Filtering, Estimation and Classification using Neural Jump ODEs

📅 2024-12-04
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
To address the challenges of filtering, estimation, and classification for irregularly sampled and partially observed online time-series data, this paper builds on the Neural Jump ODE framework. It extends Neural Jump ODEs, previously limited to autonomous systems, to general input-output dynamical systems, enabling nonparametric modeling that captures discontinuous (jump) dynamics, learns conditional expectations nonparametrically, and supports online recursive updates. Theoretically, the authors prove that the framework converges to the L²-optimal filter without assuming specific parametric forms or strong regularity conditions on the latent distributions, overcoming key limitations of classical parametric models. Empirically, the extended Neural Jump ODE outperforms Kalman filters, particle filters, and RNN/LSTM classifiers on complex non-Gaussian and multimodal latent processes while maintaining real-time inference capability. Its efficacy is demonstrated in low-latency applications, including financial trading signal prediction and wearable health monitoring.

📝 Abstract
Neural Jump ODEs model the conditional expectation between observations with neural ODEs and jump at the arrival of new observations. They have demonstrated effectiveness for fully data-driven online forecasting in settings with irregular and partial observations, operating under weak regularity assumptions. This work extends the framework to input-output systems, enabling direct applications in online filtering and classification. We establish theoretical convergence guarantees for this approach, providing a robust solution to $L^2$-optimal filtering. Empirical experiments highlight the model's superior performance over classical parametric methods, particularly in scenarios with complex underlying distributions. These results emphasise the approach's potential in time-sensitive domains such as finance and health monitoring, where real-time accuracy is crucial.
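The mechanism described in the abstract can be sketched as a simple recursion: a latent state flows according to an ODE between observation times and is updated discontinuously whenever a new observation arrives, with a readout producing the conditional-expectation estimate. The sketch below uses hardcoded scalar functions (`f`, `jump`, `readout`) as hypothetical stand-ins for the trained neural networks; it illustrates the flow-then-jump structure only, not the authors' actual architecture or training objective.

```python
import math

def f(h):
    # stand-in for the neural vector field driving h between observations
    return math.tanh(-0.5 * h)

def jump(h, x):
    # stand-in for the neural jump network applied at observation times
    return math.tanh(0.7 * h + 0.9 * x)

def readout(h):
    # stand-in for the output network mapping the latent state to the
    # conditional-expectation estimate
    return 1.2 * h

def njode_predict(times, obs, t_grid, dt=0.01):
    """Run the flow-then-jump recursion; return predictions on t_grid."""
    h, t = 0.0, 0.0
    events = sorted(zip(times, obs))
    preds = []
    for t_query in t_grid:
        # absorb every observation that arrived up to the query time
        while events and events[0][0] <= t_query:
            t_obs, x = events.pop(0)
            while t < t_obs:            # Euler ODE flow up to the observation
                h += dt * f(h)
                t += dt
            h = jump(h, x)              # discontinuous update at the observation
        while t < t_query:              # Euler ODE flow up to the query time
            h += dt * f(h)
            t += dt
        preds.append(readout(h))
    return preds

# irregular observation times, online queries at t = 0.5 and t = 1.0
preds = njode_predict(times=[0.3, 0.8], obs=[1.0, -0.5],
                      t_grid=[0.5, 1.0])
print(preds)
```

Because the state is carried forward and only updated at arrivals, each new observation costs a single jump plus an ODE solve over the elapsed interval, which is what makes the online recursive updates cheap at inference time.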
Problem

Research questions and friction points this paper is trying to address.

Develops neural ODEs with jumps for irregular time series data
Extends framework to enable real-time filtering and classification tasks
Provides theoretical convergence guarantees for L²-optimal filtering solutions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural ODEs model conditional expectations between observations
Extends framework to input-output systems for filtering
Establishes theoretical convergence for L²-optimal filtering
Jakob Heiss
Postdoc, UC Berkeley
Deep Neural Networks, Machine Learning, AI, Mathematics
Florian Krach
Doctoral Student in Mathematics, ETH Zürich
mathematical finance, stochastic calculus, neural networks, machine learning, statistics
Thorsten Schmidt
Department of Mathematics, Albert-Ludwigs-Universität Freiburg
Félix B. Tambe-Ndonfack
Department of Mathematics, Albert-Ludwigs-Universität Freiburg