Bayesian Bridge Gaussian Process Regression

📅 2025-11-21
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the high computational complexity, degraded predictive performance, and difficulty in variable selection inherent in high-dimensional Gaussian process regression (GPR), this paper proposes Bayesian Bridge Gaussian Process Regression (B²GPR). B²GPR is the first framework to integrate the Bayesian bridge prior—characterized by an ℓ<sub>q</sub> norm penalty (0 < q ≤ 2)—into GPR, enabling automatic sparse variable selection and interpretable modeling. To handle the nonstandard posterior distribution, we develop a hybrid Bayesian inference algorithm combining spherical Hamiltonian Monte Carlo (Spherical HMC) with Gibbs sampling. Theoretical analysis and extensive experiments on both synthetic and real-world datasets demonstrate that B²GPR significantly outperforms state-of-the-art methods: it achieves superior prediction accuracy while simultaneously enhancing identification of relevant variables and promoting model sparsity. Thus, B²GPR establishes a new paradigm for high-dimensional GPR that jointly ensures computational efficiency, predictive fidelity, and interpretability.

Technology Category

Application Category

📝 Abstract
The performance of Gaussian Process (GP) regression is often hampered by the curse of dimensionality, which inflates computational cost and reduces predictive power in high-dimensional problems. Variable selection is thus crucial for building efficient and accurate GP models. Inspired by Bayesian bridge regression, we propose the Bayesian Bridge Gaussian Process Regression (B extsuperscript{2}GPR) model. This framework places $ell_q$-norm constraints on key GP parameters to automatically induce sparsity and identify active variables. We formulate two distinct versions: one for $q=2$ using conjugate Gaussian priors, and another for $0<q<2$ that employs constrained flat priors, leading to non-standard, norm-constrained posterior distributions. To enable posterior inference, we design a Gibbs sampling algorithm that integrates Spherical Hamiltonian Monte Carlo (SphHMC) to efficiently sample from the constrained posteriors when $0<q<2$. Simulations and a real-data application confirm that B extsuperscript{2}GPR offers superior variable selection and prediction compared to alternative approaches.
Problem

Research questions and friction points this paper is trying to address.

Addresses curse of dimensionality in Gaussian Process regression
Performs automatic variable selection using Bayesian bridge constraints
Enables efficient sampling from norm-constrained posterior distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian bridge regression with lq-norm constraints
Gibbs sampling with Spherical Hamiltonian Monte Carlo
Sparsity induction for variable selection in GPR
🔎 Similar Papers
No similar papers found.