Neural MJD: Neural Non-Stationary Merton Jump Diffusion for Time Series Prediction

📅 2025-06-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address poor generalization and the lack of explicit stochastic-process modeling in deep learning for non-stationary time series, particularly those with abrupt change points, this paper proposes Neural MJD, a neural non-stationary Merton jump-diffusion framework. It formulates forecasting as solving a stochastic differential equation (SDE) that combines a time-inhomogeneous Itô diffusion with a time-inhomogeneous compound Poisson jump process. The authors introduce a jump-likelihood truncation mechanism with a rigorous error bound and design a restart-based Euler–Maruyama solver that provably reduces both the expected error and the variance of state estimation. All components are parameterized by neural networks, enabling end-to-end differentiable training. Experiments on synthetic and real-world benchmarks show consistent gains over state-of-the-art deep learning and statistical models, supporting the framework's ability to model non-stationarity and abrupt structural changes while generalizing well.

📝 Abstract
While deep learning methods have achieved strong performance in time series prediction, their black-box nature and inability to explicitly model underlying stochastic processes often limit their generalization to non-stationary data, especially in the presence of abrupt changes. In this work, we introduce Neural MJD, a neural network based non-stationary Merton jump diffusion (MJD) model. Our model explicitly formulates forecasting as a stochastic differential equation (SDE) simulation problem, combining a time-inhomogeneous Itô diffusion to capture non-stationary stochastic dynamics with a time-inhomogeneous compound Poisson process to model abrupt jumps. To enable tractable learning, we introduce a likelihood truncation mechanism that caps the number of jumps within small time intervals and provide a theoretical error bound for this approximation. Additionally, we propose an Euler-Maruyama with restart solver, which achieves a provably lower error bound in estimating expected states and reduced variance compared to the standard solver. Experiments on both synthetic and real-world datasets demonstrate that Neural MJD consistently outperforms state-of-the-art deep learning and statistical learning methods.
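To make the SDE formulation concrete, here is a minimal sketch of the standard (non-restart) Euler–Maruyama scheme for a time-inhomogeneous Merton jump diffusion. The drift `mu`, volatility `sigma`, and jump intensity `lam` are plain Python callables standing in for the neural parameterizations the paper learns; the restart mechanism and the exact jump-size distribution used by Neural MJD are not reproduced here.

```python
import numpy as np

def simulate_mjd(x0, mu, sigma, lam, jump_mean, jump_std,
                 T=1.0, n_steps=100, seed=0):
    """Euler-Maruyama path of dX_t = mu(t) X_t dt + sigma(t) X_t dW_t + X_t dJ_t,
    where J is a compound Poisson process with intensity lam(t) and
    log-normal jump sizes (classical Merton setup; placeholder coefficients)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        t = i * dt
        dW = rng.normal(0.0, np.sqrt(dt))          # Brownian increment
        n_jumps = rng.poisson(lam(t) * dt)         # jumps in this small interval
        # each jump multiplies the state by exp(Z), Z ~ N(jump_mean, jump_std^2)
        jump = np.sum(np.exp(rng.normal(jump_mean, jump_std, n_jumps)) - 1.0) \
            if n_jumps else 0.0
        x[i + 1] = x[i] + mu(t) * x[i] * dt + sigma(t) * x[i] * dW + x[i] * jump
    return x

# Example with constant coefficients:
path = simulate_mjd(1.0, mu=lambda t: 0.05, sigma=lambda t: 0.2,
                    lam=lambda t: 1.0, jump_mean=0.0, jump_std=0.1,
                    T=1.0, n_steps=50)
```

In the paper's setting, `mu`, `sigma`, and `lam` would be outputs of neural networks conditioned on context, and the restart variant would periodically re-anchor the solver to reduce error and variance.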
Problem

Research questions and friction points this paper is trying to address.

Modeling non-stationary time series with abrupt changes
Combining stochastic dynamics and jump processes for forecasting
Improving accuracy and reducing variance in SDE simulations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural-network-parameterized non-stationary MJD model
Likelihood truncation mechanism for tractable jump learning, with a theoretical error bound
Euler–Maruyama solver with restart, giving lower expected error and reduced variance
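The truncation idea can be illustrated on the classical constant-coefficient Merton model, whose transition density is an infinite Poisson-weighted mixture of Gaussians over the jump count; capping the count at `K` jumps per interval, in the spirit of the paper's mechanism, makes the likelihood a finite sum. This sketch uses the constant-coefficient case for clarity; the paper's coefficients are time-varying and neural, and `K` here is an illustrative hyperparameter.

```python
import math

def truncated_mjd_loglik(r, dt, mu, sigma, lam, m, s, K=3):
    """Log-likelihood of a log-return r over an interval dt under a Merton
    jump diffusion, truncating the Poisson mixture over jump counts at K.
    Jump log-sizes are N(m, s^2); lam is the jump intensity."""
    total = 0.0
    for k in range(K + 1):
        # Poisson weight for exactly k jumps in the interval
        weight = math.exp(-lam * dt) * (lam * dt) ** k / math.factorial(k)
        # conditional on k jumps, the log-return is Gaussian
        mean = (mu - 0.5 * sigma ** 2) * dt + k * m
        var = sigma ** 2 * dt + k * s ** 2
        total += weight * math.exp(-(r - mean) ** 2 / (2 * var)) \
            / math.sqrt(2 * math.pi * var)
    return math.log(total)
```

With `lam = 0` (no jumps) only the `k = 0` term survives and the expression reduces to the ordinary Gaussian log-density, which is a convenient sanity check; the paper's error bound quantifies how much probability mass the truncation at `K` discards.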