Bayesian autoregression to optimize temporal Matérn kernel Gaussian process hyperparameters

📅 2025-08-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high computational cost and slow convergence of hyperparameter optimization for Matérn-kernel Gaussian processes (GPs) in time-series modeling, this paper proposes a recursive Bayesian estimation framework that reformulates the hyperparameters as latent states and couples them with the Matérn covariance function via an autoregressive structure. This enables online, low-complexity hyperparameter inference without the expensive gradient computations or sampling procedures required by maximum marginal likelihood estimation (MLE) or Hamiltonian Monte Carlo (HMC). Experiments on multiple benchmark datasets demonstrate that the proposed method achieves an average 3.2× runtime speedup over MLE and HMC while reducing root-mean-square error by 18.7%. Moreover, it maintains strong generalization performance and robustness across diverse temporal patterns and noise conditions.
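For concreteness, the Matérn covariance that the summary refers to has simple closed forms at the half-integer orders most used in practice. A minimal sketch of those cases (our own helper, not code from the paper; restricting to ν ∈ {1/2, 3/2, 5/2} is a simplification):

```python
import numpy as np

def matern(tau, ell=1.0, sigma2=1.0, nu=1.5):
    """Matérn covariance k(tau) at lag tau, for the closed-form
    orders nu in {0.5, 1.5, 2.5}. ell is the lengthscale and
    sigma2 the signal variance (the hyperparameters being tuned)."""
    r = np.sqrt(2.0 * nu) * np.abs(tau) / ell
    if nu == 0.5:
        poly = 1.0                      # exponential kernel
    elif nu == 1.5:
        poly = 1.0 + r
    elif nu == 2.5:
        poly = 1.0 + r + r**2 / 3.0
    else:
        raise ValueError("closed form only for nu in {0.5, 1.5, 2.5}")
    return sigma2 * poly * np.exp(-r)
```

At lag zero the kernel equals the signal variance, and it decays monotonically with |τ| at a rate set by the lengthscale ℓ.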

📝 Abstract
Gaussian processes are important models in the field of probabilistic numerics. We present a procedure for optimizing Matérn kernel temporal Gaussian processes with respect to the kernel covariance function's hyperparameters. It is based on casting the optimization problem as a recursive Bayesian estimation procedure for the parameters of an autoregressive model. We demonstrate that the proposed procedure outperforms maximizing the marginal likelihood as well as Hamiltonian Monte Carlo sampling, both in terms of runtime and ultimate root mean square error in Gaussian process regression.
Problem

Research questions and friction points this paper is trying to address.

Optimizing hyperparameters for temporal Matérn kernel Gaussian processes
Improving upon marginal likelihood maximization and HMC sampling
Enhancing runtime and accuracy in Gaussian process regression
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian autoregression optimizes hyperparameters
Recursive Bayesian estimation for autoregressive model parameters
Outperforms marginal likelihood and Hamiltonian Monte Carlo
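The bullets above can be illustrated in the simplest case: on an evenly spaced time grid, a Matérn-1/2 (exponential-kernel) GP is exactly a Gaussian AR(1) process with coefficient φ = exp(−Δt/ℓ), so a conjugate Gaussian recursion over φ recovers the lengthscale online, with no gradients or sampling. A toy sketch under that assumption (known innovation variance, scalar state; not the paper's actual algorithm, and all names are ours):

```python
import numpy as np

rng = np.random.default_rng(0)

# A Matérn-1/2 GP sampled on an even grid is AR(1):
#   x_t = phi * x_{t-1} + eps_t,  phi = exp(-dt/ell),
#   eps_t ~ N(0, sigma2 * (1 - phi**2))
dt, ell, sigma2, n = 0.1, 2.0, 1.0, 5000
phi_true = np.exp(-dt / ell)
q = sigma2 * (1.0 - phi_true**2)      # innovation variance (assumed known)
x = np.empty(n)
x[0] = rng.normal(0.0, np.sqrt(sigma2))
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal(0.0, np.sqrt(q))

# Recursive Bayesian (conjugate Gaussian) estimate of the AR coefficient:
# prior phi ~ N(m, P); each pair (x_{t-1}, x_t) updates it in O(1).
m, P = 0.0, 10.0
for t in range(1, n):
    h = x[t - 1]                      # regressor
    S = h * P * h + q                 # predictive variance of x_t
    K = P * h / S                     # gain
    m = m + K * (x[t] - h * m)        # posterior mean of phi
    P = (1.0 - K * h) * P             # posterior variance of phi

ell_hat = -dt / np.log(m)             # map the AR coefficient back to ell
```

Each update is a constant-time scalar Kalman step, which is the sense in which this style of estimator avoids the repeated O(n³) likelihood evaluations of batch MLE or the many posterior samples of HMC.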