Unbiased Stochastic Optimization for Gaussian Processes on Finite Dimensional RKHS

📅 2025-08-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing Gaussian process (GP) methods for stochastic hyperparameter learning suffer from biased gradients or rely on inducing-point approximations, hindering convergence to stationary points of the true marginal likelihood. This paper proposes an unbiased stochastic inference framework grounded in finite-dimensional reproducing kernel Hilbert spaces (RKHS). It is the first to yield unbiased stochastic gradient estimates of the marginal likelihood—without inducing points or variational approximations—thereby guaranteeing convergence to stationary points of the exact objective. Theoretically, the approach extends to infinite-dimensional RKHS and remains robust under resource constraints. Experiments demonstrate that, in memory-limited settings—such as small mini-batches or few inducing points—the method significantly outperforms baselines including stochastic variational inference, achieving superior model fit and predictive performance.

📝 Abstract
Current methods for stochastic hyperparameter learning in Gaussian Processes (GPs) rely on approximations, such as computing biased stochastic gradients or using inducing points in stochastic variational inference. However, such methods are not guaranteed to converge to a stationary point of the true marginal likelihood. In this work, we propose algorithms for exact stochastic inference of GPs with kernels that induce a Reproducing Kernel Hilbert Space (RKHS) of moderate finite dimension. Our approach can also be extended to infinite-dimensional RKHSs at the cost of forgoing exactness. For both finite- and infinite-dimensional RKHSs, our method achieves better experimental results than existing methods when memory resources limit the feasible batch size and the possible number of inducing points.
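The structural fact the abstract leans on is that a kernel with a D-dimensional feature map φ makes the exact GP marginal likelihood a function of D-dimensional sufficient statistics (ΦᵀΦ, Φᵀy, yᵀy), which can be accumulated exactly over mini-batches in O(D²) memory. The sketch below illustrates that weight-space identity only; it is not the paper's unbiased gradient estimator, and the function names and NumPy setup are illustrative assumptions:

```python
import numpy as np

def log_marginal_likelihood_streamed(batches, D, sigma_w2, sigma_n2):
    """Exact GP log marginal likelihood for a D-dimensional feature map,
    computed from sufficient statistics accumulated over mini-batches.

    Model (weight-space view of a finite-dimensional-RKHS GP):
      y = Phi w + eps,  w ~ N(0, sigma_w2 I_D),  eps ~ N(0, sigma_n2 I_N),
    equivalently y ~ N(0, sigma_w2 Phi Phi^T + sigma_n2 I_N).

    batches: iterable of (Phi_b, y_b), Phi_b of shape (B, D).
    """
    A = np.zeros((D, D))   # running Phi^T Phi
    b = np.zeros(D)        # running Phi^T y
    yy = 0.0               # running y^T y
    N = 0
    for Phi_b, y_b in batches:       # single pass; only O(D^2) state kept
        A += Phi_b.T @ Phi_b
        b += Phi_b.T @ y_b
        yy += y_b @ y_b
        N += len(y_b)
    # Woodbury identity: with M = Phi^T Phi + (sigma_n2/sigma_w2) I_D,
    #   y^T K^{-1} y = (y^T y - b^T M^{-1} b) / sigma_n2
    #   log det K    = (N-D) log sigma_n2 + D log sigma_w2 + log det M
    M = A + (sigma_n2 / sigma_w2) * np.eye(D)
    _, logdetM = np.linalg.slogdet(M)
    quad = (yy - b @ np.linalg.solve(M, b)) / sigma_n2
    logdetK = (N - D) * np.log(sigma_n2) + D * np.log(sigma_w2) + logdetM
    return -0.5 * (quad + logdetK + N * np.log(2 * np.pi))
```

Because the statistics are plain sums over data points, this quantity (and its hyperparameter gradient) can be computed exactly from streamed mini-batches, which is the setting where the paper contrasts with biased stochastic-gradient and inducing-point approximations.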
Problem

Research questions and friction points this paper is trying to address.

Exact stochastic inference for Gaussian Processes
Overcoming biased approximations in hyperparameter learning
Addressing memory constraints on batch size and number of inducing points
Innovation

Methods, ideas, or system contributions that make the work stand out.

Exact stochastic inference for finite-dimensional RKHSs
Extendable to infinite-dimensional RKHSs at the cost of exactness
Better performance under memory constraints than existing methods