🤖 AI Summary
Traditional methods for modeling high-dimensional dynamical systems suffer from the curse of dimensionality, incurring prohibitive computational cost, or rely on restrictive prior assumptions such as sparsity or symmetry. To address this, the authors propose the Multivariate Occupation Kernel (MOCK) method, a vector-field learning approach whose computational cost scales linearly with the state dimension, avoiding the quadratic scaling of explicit formulations. MOCK operates in a vector-valued reproducing kernel Hilbert space and uses an implicit formulation to nonparametrically learn unknown ordinary differential equations directly from trajectory samples, without requiring structural prior knowledge of the underlying system. Evaluated on nine benchmark datasets spanning 2 to 1024 dimensions, MOCK outperforms all comparators on 3 of the 9 datasets for full-trajectory prediction and on 4 of the 9 for next-point prediction.
📝 Abstract
Learning a nonparametric system of ordinary differential equations from trajectories in a $d$-dimensional state space requires learning $d$ functions of $d$ variables. Explicit formulations often scale quadratically in $d$ unless additional knowledge about system properties, such as sparsity and symmetries, is available. In this work, we propose a linear approach, the multivariate occupation kernel method (MOCK), using the implicit formulation provided by vector-valued reproducing kernel Hilbert spaces. The solution for the vector field relies on multivariate occupation kernel functions associated with the trajectories and scales linearly with the dimension of the state space. We validate MOCK through experiments on a variety of simulated and real datasets ranging from 2 to 1024 dimensions. MOCK outperforms all other comparators on 3 of the 9 datasets on full trajectory prediction and 4 of the 9 datasets on next-point prediction.