Non-Parametric Learning of Stochastic Differential Equations with Non-asymptotic Fast Rates of Convergence

📅 2023-05-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the joint identification of drift and diffusion coefficients of multidimensional nonlinear stochastic differential equations (SDEs) from discrete-time observations. The authors propose a nonparametric learning framework grounded in reproducing kernel Hilbert spaces (RKHS): the solution of the associated Fokker–Planck equation is approximated within an RKHS, and its weak formulation serves as a physics-informed regularization constraint. Theoretically, the paper establishes a tight, non-asymptotic convergence-rate bound whose rate adapts to the Sobolev regularity of the unknown coefficients, bypassing the restrictive assumptions (e.g., linearity or separability) and asymptotic regimes required by conventional methods. Algorithmically, the approach balances accuracy and efficiency: offline pre-processing enables a scalable, computationally tractable numerical implementation.
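For concreteness, here is the standard form of the Fokker–Planck equation the summary refers to (textbook notation, not taken from the paper): for an SDE $dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t$ in $\mathbb{R}^d$ with diffusion matrix $a = \sigma\sigma^\top$, the density $p(t,x)$ of $X_t$ satisfies

```latex
\partial_t p(t,x)
  = -\sum_{i=1}^{d} \partial_{x_i}\!\bigl(b_i(x)\,p(t,x)\bigr)
  + \frac{1}{2}\sum_{i,j=1}^{d} \partial_{x_i}\partial_{x_j}\!\bigl(a_{ij}(x)\,p(t,x)\bigr).
```

The weak formulation pairs this identity with smooth test functions, which is what allows it to act as a regularization constraint on RKHS approximants rather than requiring pointwise derivatives of the estimated density.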
📝 Abstract
We propose a novel non-parametric learning paradigm for the identification of drift and diffusion coefficients of multi-dimensional non-linear stochastic differential equations, which relies upon discrete-time observations of the state. The key idea essentially consists of fitting an RKHS-based approximation of the corresponding Fokker-Planck equation to such observations, yielding theoretical estimates of non-asymptotic learning rates which, unlike previous works, become increasingly tighter when the regularity of the unknown drift and diffusion coefficients becomes higher. Since our method is kernel-based, offline pre-processing may be profitably leveraged to enable efficient numerical implementation, offering excellent balance between precision and computational complexity.
Problem

Research questions and friction points this paper is trying to address.

Non-parametric learning of SDEs
Fast convergence rates
RKHS-based Fokker-Planck approximation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Non-parametric learning paradigm
RKHS-based approximation
Kernel-based efficient implementation
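To make the kernel-based estimation idea concrete, below is a minimal illustrative sketch: a Nadaraya–Watson (Gaussian-kernel) estimate of the drift of a 1-D Ornstein–Uhlenbeck process from discrete-time observations. This is a simple stand-in for nonparametric drift learning, not the paper's RKHS/Fokker–Planck estimator; all function names, parameters, and the OU test case are illustrative choices.

```python
import numpy as np

def simulate_ou(theta=1.0, sigma=0.5, dt=0.01, n=5000, seed=0):
    """Simulate a 1-D Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW
    with the Euler-Maruyama scheme (synthetic data for the sketch)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n + 1)
    x[0] = 1.0
    for k in range(n):
        x[k + 1] = x[k] - theta * x[k] * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

def kernel_drift_estimate(x, dt, grid, bandwidth=0.2):
    """Nadaraya-Watson estimate of the drift b(x) ~ E[dX | X=x] / dt
    with a Gaussian kernel; an illustrative nonparametric estimator,
    not the paper's RKHS/Fokker-Planck method."""
    increments = np.diff(x) / dt            # per-step drift proxies
    states = x[:-1]
    # Gaussian kernel weights between grid points and observed states
    w = np.exp(-0.5 * ((grid[:, None] - states[None, :]) / bandwidth) ** 2)
    return (w * increments).sum(axis=1) / w.sum(axis=1)

x = simulate_ou()
grid = np.linspace(-0.5, 0.5, 5)
b_hat = kernel_drift_estimate(x, dt=0.01, grid=grid)
# For this OU process the true drift is b(x) = -theta * x = -x,
# so b_hat should roughly track -grid.
```

Joint estimation of the diffusion coefficient would proceed analogously from squared increments; the paper's contribution is to replace such pointwise regressions with an RKHS fit constrained by the weak Fokker–Planck equation, which is what yields the regularity-adaptive rates.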