High-Dimensional Gaussian Process Regression with Soft Kernel Interpolation

📅 2024-10-28
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Structured Kernel Interpolation (SKI) suffers from the curse of dimensionality and scalability bottlenecks in high-dimensional settings because it relies on fixed, dense Cartesian grids. To address this, we propose Soft Kernel Interpolation (SoftKI), a method that combines SKI with variational inducing point principles. SoftKI introduces learnable interpolation points and employs softmax-weighted combinations to achieve a dynamic, sparse, and adaptive kernel approximation, marking the first integration of learnable interpolation with probabilistic weighting. By replacing rigid grid-based interpolation with differentiable, data-adaptive point selection, SoftKI circumvents the exponential growth of grid points in high dimensions. Empirically, it trains significantly more efficiently on roughly 10-dimensional data while matching or exceeding the predictive accuracy of state-of-the-art approximate Gaussian process methods. Crucially, SoftKI preserves numerical stability and enables end-to-end optimization of the marginal log-likelihood.
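The core idea above, approximating the kernel matrix by softmax-weighted interpolation from a small set of learnable points, can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names (`softki_kernel`, `rbf_kernel`) and the `temperature` parameter are assumptions, and the paper's exact weight parameterization may differ.

```python
import numpy as np

def softmax(a, axis=-1):
    # Numerically stable softmax along the given axis.
    a = a - a.max(axis=axis, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=axis, keepdims=True)

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel between row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def softki_kernel(X, Z, lengthscale=1.0, temperature=1.0):
    """Sketch of a SoftKI-style kernel approximation:
    K_XX ~= W K_ZZ W^T, where W are softmax interpolation weights
    from the n data points X to m learnable points Z (m << n)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)  # (n, m) sq. distances
    W = softmax(-d2 / temperature, axis=1)               # rows sum to 1
    K_zz = rbf_kernel(Z, Z, lengthscale)                 # (m, m) exact kernel
    return W @ K_zz @ W.T                                # (n, n) approximation
```

Because `W` is built from differentiable distances to `Z`, the interpolation points can be moved by gradient descent, unlike SKI's static lattice, and `m` stays fixed as the dimension grows.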

📝 Abstract
We introduce Soft Kernel Interpolation (SoftKI), a method that combines aspects of Structured Kernel Interpolation (SKI) and variational inducing point methods, to achieve scalable Gaussian Process (GP) regression on high-dimensional datasets. SoftKI approximates a kernel via softmax interpolation from a smaller number of interpolation points learned by optimizing a combination of the SoftKI marginal log-likelihood (MLL), and when needed, an approximate MLL for improved numerical stability. Consequently, it can overcome the dimensionality scaling challenges that SKI faces when interpolating from a dense and static lattice while retaining the flexibility of variational methods to adapt inducing points to the dataset. We demonstrate the effectiveness of SoftKI across various examples and show that it is competitive with other approximated GP methods when the data dimensionality is modest (around 10).
Problem

Research questions and friction points this paper is trying to address.

Scalable Gaussian Process regression for high-dimensional datasets
Overcoming dimensionality scaling challenges in kernel interpolation
Combining flexibility of variational methods with efficient interpolation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Soft Kernel Interpolation combines SKI and variational methods
Optimizes SoftKI marginal log-likelihood for stability
Overcomes dimensionality scaling challenges of SKI
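The second bullet, fitting the interpolation points by optimizing the marginal log-likelihood, can be illustrated with the standard GP negative MLL evaluated under the approximate kernel. This is a hedged sketch under the same assumptions as above (softmax weights over squared distances, RBF base kernel); the paper additionally uses an approximate MLL for numerical stability when needed, which is not reproduced here.

```python
import numpy as np

def softki_neg_mll(X, y, Z, noise=0.1, lengthscale=1.0, temperature=1.0):
    """Negative marginal log-likelihood of y ~ N(0, W K_ZZ W^T + noise*I).
    In SoftKI this objective would be minimized w.r.t. Z (and kernel
    hyperparameters) by automatic differentiation; names are illustrative."""
    n = len(X)
    # Softmax interpolation weights from X to the interpolation points Z.
    d2_xz = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    a = -d2_xz / temperature
    a -= a.max(axis=1, keepdims=True)
    W = np.exp(a)
    W /= W.sum(axis=1, keepdims=True)
    # Exact kernel on the small set of interpolation points.
    d2_zz = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    K_zz = np.exp(-0.5 * d2_zz / lengthscale**2)
    K = W @ K_zz @ W.T + noise * np.eye(n)  # PD: PSD low-rank term + noise*I
    # Cholesky-based evaluation of 0.5*(y^T K^-1 y + log|K| + n log 2*pi).
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * (y @ alpha) + np.log(np.diag(L)).sum() + 0.5 * n * np.log(2 * np.pi)
```

Since the weights depend smoothly on `Z`, this loss is end-to-end differentiable, which is what lets the interpolation points adapt to the data rather than sit on a fixed grid.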