The Kernel Manifold: A Geometric Approach to Gaussian Process Model Selection

📅 2026-01-08
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes a geometric framework for efficient kernel search in Gaussian process regression, addressing the high computational cost and inefficiency of existing methods. By introducing an expected divergence-based metric, the discrete space of composite kernels is, for the first time, modeled as a continuous Euclidean manifold. The embedding is realized via multidimensional scaling (MDS), allowing Bayesian optimization to navigate the resulting smooth, stable geometric structure along efficient paths toward the optimal kernel. Experimental results show that the proposed approach significantly outperforms baseline methods—including those guided by large language models—across synthetic datasets, real-world time series, and melt pool prediction tasks in additive manufacturing, with consistent improvements in both predictive accuracy and uncertainty calibration.

📝 Abstract
Gaussian Process (GP) regression is a powerful nonparametric Bayesian framework, but its performance depends critically on the choice of covariance kernel. Selecting an appropriate kernel is therefore central to model quality, yet remains one of the most challenging and computationally expensive steps in probabilistic modeling. We present a Bayesian optimization framework built on kernel-of-kernels geometry, using expected divergence-based distances between GP priors to explore kernel space efficiently. A multidimensional scaling (MDS) embedding of this distance matrix maps a discrete kernel library into a continuous Euclidean manifold, enabling smooth BO. In this formulation, the input space comprises kernel compositions, the objective is the log marginal likelihood, and featurization is given by the MDS coordinates. When the divergence yields a valid metric, the embedding preserves geometry and produces a stable BO landscape. We demonstrate the approach on synthetic benchmarks, real-world time-series datasets, and an additive manufacturing case study predicting melt-pool geometry, achieving superior predictive accuracy and uncertainty calibration relative to baselines including Large Language Model (LLM)-guided search. This framework establishes a reusable probabilistic geometry for kernel search, with direct relevance to GP modeling and deep kernel learning.
Problem

Research questions and friction points this paper is trying to address.

Gaussian Process
kernel selection
model selection
covariance kernel
Bayesian optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gaussian Process
Kernel Selection
Bayesian Optimization
Manifold Embedding
Expected Divergence
Md Shafiqul Islam
Professor of Mathematics, School of Mathematical and Computational Sciences, University of Prince
Dynamical Systems, Ergodic Theory, Random maps and applications
S. P. Padhy
Department of Materials Science and Engineering, Texas A&M University, College Station, 77843, TX, USA
Douglas Allaire
Associate Professor, Texas A&M University
optimization, uncertainty quantification, multifidelity methods, machine learning
Raymundo Arróyave
Department of Materials Science and Engineering, Texas A&M University, College Station, 77843, TX, USA; J. Mike Walker ’66 Department of Mechanical Engineering, Texas A&M University, College Station, 77843, TX, USA