🤖 AI Summary
This work addresses the joint identification of drift and diffusion coefficients for multidimensional nonlinear stochastic differential equations (SDEs) from discrete-time observations. We propose a nonparametric learning framework grounded in reproducing kernel Hilbert spaces (RKHS): the solution of the associated Fokker–Planck equation is approximated within an RKHS, and its weak formulation serves as a physics-informed regularization constraint. Theoretically, we establish non-asymptotic learning-rate bounds that, unlike those of prior work, tighten as the Sobolev regularity of the unknown coefficients increases, bypassing the restrictive assumptions (e.g., linearity or separability) and asymptotic regimes required by conventional methods. Algorithmically, the approach balances accuracy and efficiency: offline pre-processing enables a scalable and computationally tractable numerical implementation.
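For concreteness, the Fokker–Planck equation and the weak formulation referred to above can be stated as follows. These are standard facts about SDEs, written here in generic notation that need not match the paper's own:

```latex
% SDE: dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t in R^d, with a = \sigma\sigma^\top.
% The density p(t,\cdot) of X_t solves the Fokker--Planck equation
\partial_t p(t,x)
  = -\sum_{i=1}^{d} \partial_{x_i}\!\bigl(b_i(x)\,p(t,x)\bigr)
  + \tfrac{1}{2}\sum_{i,j=1}^{d} \partial_{x_i}\partial_{x_j}\!\bigl(a_{ij}(x)\,p(t,x)\bigr).
% Weak formulation: for any smooth, compactly supported test function \varphi,
\frac{d}{dt}\int \varphi(x)\,p(t,x)\,dx
  = \int \Bigl( b(x)\cdot\nabla\varphi(x)
  + \tfrac{1}{2}\,a(x):\nabla^2\varphi(x) \Bigr)\,p(t,x)\,dx .
```

The weak form moves all derivatives onto the test function, which is what makes it usable as a regularization constraint on discretely observed data: the time derivative on the left can be approximated from empirical averages of \(\varphi(X_t)\) at consecutive observation times.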
📝 Abstract
We propose a novel non-parametric learning paradigm for the identification of the drift and diffusion coefficients of multi-dimensional non-linear stochastic differential equations, relying on discrete-time observations of the state. The key idea consists in fitting an RKHS-based approximation of the corresponding Fokker-Planck equation to such observations, yielding theoretical estimates of non-asymptotic learning rates which, unlike in previous works, become tighter as the regularity of the unknown drift and diffusion coefficients increases. Since our method is kernel-based, offline pre-processing may be profitably leveraged to enable an efficient numerical implementation, offering an excellent balance between precision and computational complexity.
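To illustrate what kernel-based, non-parametric drift identification from discrete-time observations looks like in the simplest setting, the sketch below uses a plain Nadaraya–Watson (conditional-moment) estimator on a simulated 1-D Ornstein–Uhlenbeck process. This is a deliberately simplified stand-in, not the paper's Fokker–Planck-constrained RKHS method; all names, parameters, and the bandwidth choice are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a 1-D Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW
# via the Euler-Maruyama scheme, observed at a fixed time step dt.
theta, sigma, dt, n = 1.0, 0.5, 0.01, 200_000
x = np.empty(n)
x[0] = 0.0
noise = rng.normal(scale=np.sqrt(dt), size=n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * noise[i]

def drift_estimate(query, states, increments, bandwidth=0.1):
    """Nadaraya-Watson estimate of b(x) ~ E[X_{t+dt} - X_t | X_t = x] / dt."""
    # Gaussian kernel weights between query points and observed states.
    w = np.exp(-0.5 * ((query[:, None] - states[None, :]) / bandwidth) ** 2)
    return (w @ increments) / (w.sum(axis=1) * dt)

grid = np.array([-0.5, 0.0, 0.5])
b_hat = drift_estimate(grid, x[:-1], np.diff(x))
print(b_hat)  # should lie close to the true drift -theta * grid
```

A kernel ridge variant of the same conditional-moment fit would replace the weighted average by a regularized RKHS regression; the paper's approach differs in that it fits the Fokker-Planck equation itself rather than conditional increments, and jointly recovers the diffusion coefficient.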