Neural Stochastic Flows: Solver-Free Modelling and Inference for SDE Solutions

📅 2025-10-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional numerical solvers for stochastic differential equations (SDEs) are computationally costly and cannot efficiently sample at arbitrary time points, a critical limitation when modelling irregular time series. To address this, the paper proposes Neural Stochastic Flows (NSFs), presented as the first method to tightly integrate conditional normalising flows with the transition properties of SDEs. NSF combines a structurally constrained flow architecture, latent-variable modelling, and preservation of the mathematical properties of stochastic flows to enable explicit, one-shot sampling of SDE state transitions, bypassing numerical integration entirely. Evaluated on synthetic, financial, and physical datasets, NSF achieves up to two orders of magnitude (≈100×) speed-up over classical solvers (e.g., Euler–Maruyama) at large time intervals while maintaining comparable distributional fidelity. The core contribution is a learnable SDE transition operator that is both theoretically grounded and practically efficient.
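To make the cost the summary refers to concrete, here is a minimal sketch of the classical baseline NSF bypasses: Euler–Maruyama simulation, where sampling the state at a distant time point requires many small sequential steps. The Ornstein–Uhlenbeck drift/diffusion and all parameters below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def euler_maruyama(x0, drift, diffusion, t0, t1, n_steps, rng):
    """Simulate one path of dX = f(X,t) dt + g(X,t) dW with Euler-Maruyama.

    Sampling X(t1) | X(t0) this way costs n_steps sequential function
    evaluations, so the cost grows with the time gap t1 - t0."""
    dt = (t1 - t0) / n_steps
    x, t = x0, t0
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))  # Brownian increment
        x = x + drift(x, t) * dt + diffusion(x, t) * dw
        t += dt
    return x

# Example: Ornstein-Uhlenbeck process dX = -theta * X dt + sigma dW
# (theta = 0.5, sigma = 0.3; illustrative parameters only).
rng = np.random.default_rng(0)
samples = [euler_maruyama(1.0, lambda x, t: -0.5 * x,
                          lambda x, t: 0.3, 0.0, 5.0, 1000, rng)
           for _ in range(2000)]
# Exact transition mean is x0 * exp(-theta * (t1 - t0)) = exp(-2.5) ~ 0.082
print(np.mean(samples))
```

Each of the 2000 samples above takes 1000 sequential steps; a learned transition operator would replace that inner loop with a single draw.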

📝 Abstract
Stochastic differential equations (SDEs) are well suited to modelling noisy and irregularly sampled time series found in finance, physics, and machine learning. Traditional approaches require costly numerical solvers to sample between arbitrary time points. We introduce Neural Stochastic Flows (NSFs) and their latent variants, which directly learn (latent) SDE transition laws using conditional normalising flows with architectural constraints that preserve properties inherited from stochastic flows. This enables one-shot sampling between arbitrary states and yields up to two orders of magnitude speed-ups at large time gaps. Experiments on synthetic SDE simulations and on real-world tracking and video data show that NSFs maintain distributional accuracy comparable to numerical approaches while dramatically reducing computation for arbitrary time-point sampling.
Problem

Research questions and friction points this paper is trying to address.

Modelling irregular time series with SDEs
Eliminating costly numerical solvers for sampling
Enabling efficient arbitrary time-point sampling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural Stochastic Flows learn SDE transition laws
Conditional normalising flows enable one-shot sampling
Preserves distributional accuracy with computational speed-ups
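The one-shot idea behind these bullets can be sketched with a process whose transition law is known in closed form: for the Ornstein–Uhlenbeck SDE, X(t0+Δt) | X(t0) is Gaussian, so a single affine map of base noise samples it exactly, for any Δt, with no solver steps. NSF learns such conditional maps with neural normalising flows for SDEs that lack closed-form transitions; the code below is an illustrative analogy with assumed parameters, not the paper's architecture.

```python
import numpy as np

def ou_one_shot(x0, dt, theta=0.5, sigma=0.3, rng=None):
    """One-shot sample of X(t0+dt) | X(t0) = x0 for the OU process
    dX = -theta * X dt + sigma dW, using its closed-form transition law.

    A single affine transform of Gaussian base noise plays the role a
    learned conditional flow plays in NSF (illustrative parameters)."""
    rng = rng or np.random.default_rng()
    mean = x0 * np.exp(-theta * dt)
    var = sigma**2 / (2.0 * theta) * (1.0 - np.exp(-2.0 * theta * dt))
    z = rng.normal()                   # base noise
    return mean + np.sqrt(var) * z     # conditional "flow": affine in z

rng = np.random.default_rng(1)
xs = np.array([ou_one_shot(1.0, 5.0, rng=rng) for _ in range(2000)])
print(xs.mean(), xs.std())
```

Note the cost is one noise draw per sample regardless of the time gap dt, which is the source of the large-time-gap speed-ups reported in the abstract.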