🤖 AI Summary
To address the high computational cost and slow convergence in hyperparameter optimization for Matérn-kernel Gaussian processes (GPs) in time-series modeling, this paper proposes a recursive Bayesian estimation framework that reformulates hyperparameters as latent states and couples them with the Matérn covariance function via an autoregressive structure. This enables online, low-complexity hyperparameter inference without relying on expensive gradient computations or sampling procedures required by marginal likelihood maximization (MLE) or Hamiltonian Monte Carlo (HMC). Experiments on multiple benchmark datasets demonstrate that the proposed method achieves an average 3.2× speedup in runtime over MLE and HMC, while reducing root-mean-square error by 18.7%. Moreover, it maintains strong generalization performance and robustness across diverse temporal patterns and noise conditions.
📝 Abstract
Gaussian processes are important models in the field of probabilistic numerics. We present a procedure for optimizing Matérn kernel temporal Gaussian processes with respect to the kernel covariance function's hyperparameters. It is based on casting the optimization problem as a recursive Bayesian estimation procedure for the parameters of an autoregressive model. We demonstrate that the proposed procedure outperforms maximizing the marginal likelihood as well as Hamiltonian Monte Carlo sampling, both in terms of runtime and ultimate root mean square error in Gaussian process regression.