Wasserstein-type Gaussian Process Regressions for Input Measurement Uncertainty

📅 2026-03-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
Standard Gaussian process regression tends to yield overconfident posterior intervals and biased decisions when inputs are subject to measurement error. To address this issue, this work models noisy inputs as probability measures and proposes the deterministic projected Wasserstein ARD Gaussian process (PWAGP). By leveraging the Wasserstein distance, the method constructs a closed-form, positive-definite, and scalable covariance function for distributional inputs. Unlike approaches relying on latent variables or Monte Carlo sampling, PWAGP avoids stochastic approximations, thereby enhancing the transparency and robustness of uncertainty quantification. The resulting framework maintains computational efficiency while significantly improving predictive reliability under input noise.

📝 Abstract
Gaussian process (GP) regression is widely used for uncertainty quantification, yet the standard formulation assumes noise-free covariates. When inputs are measured with error, this errors-in-variables (EIV) setting can lead to optimistically narrow posterior intervals and biased decisions. We study GP regression under input measurement uncertainty by representing each noisy input as a probability measure and defining covariance through Wasserstein distances between these measures. Building on this perspective, we instantiate a deterministic projected Wasserstein ARD (PWA) kernel whose one-dimensional components admit closed-form expressions and whose product structure yields a scalable, positive-definite kernel on distributions. Unlike latent-input GP models, PWA-based GPs (PWAGPs) handle input noise without introducing unobserved covariates or Monte Carlo projections, making uncertainty quantification more transparent and robust.
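To illustrate the kernel construction the abstract describes, here is a minimal sketch of a product-form ARD kernel built from closed-form one-dimensional Wasserstein distances. It assumes each noisy input dimension is summarized as a Gaussian N(m, s²), for which the squared 2-Wasserstein distance has the well-known closed form (m1 − m2)² + (s1 − s2)². The function names and the Gaussian-input assumption are illustrative, not the paper's actual implementation.

```python
import numpy as np

def w2_sq_1d(m1, s1, m2, s2):
    """Closed-form squared 2-Wasserstein distance between the
    one-dimensional Gaussians N(m1, s1^2) and N(m2, s2^2)."""
    return (m1 - m2) ** 2 + (s1 - s2) ** 2

def pwa_kernel(mu1, sd1, mu2, sd2, lengthscales):
    """Sketch of a projected-Wasserstein ARD kernel on distributional inputs.

    mu*, sd* : arrays of shape (d,) with per-dimension means / std devs
               of the two (assumed Gaussian) noisy inputs.
    lengthscales : array of shape (d,) with one ARD lengthscale per dimension.

    The kernel is a product over dimensions of squared-exponential factors
    in the per-dimension Wasserstein distance, which keeps it positive
    definite and cheap to evaluate (no sampling or latent variables).
    """
    d2 = w2_sq_1d(mu1, sd1, mu2, sd2)  # per-dimension W2^2, shape (d,)
    return float(np.exp(-np.sum(d2 / (2.0 * lengthscales ** 2))))

# When all input std devs are zero, the distances reduce to |m1 - m2| per
# dimension and the kernel recovers the standard ARD squared-exponential.
```

The noise-free limit is a useful sanity check: with `sd1 = sd2 = 0` the expression collapses to the ordinary ARD RBF kernel on point inputs, so input uncertainty enters only through the extra (s1 − s2)² term in each dimension.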
Problem

Research questions and friction points this paper is trying to address.

Gaussian process regression
input measurement uncertainty
errors-in-variables
uncertainty quantification
Wasserstein distance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wasserstein distance
Gaussian process regression
errors-in-variables
input uncertainty
positive-definite kernel
Hengrui Luo
Unknown affiliation
Xiaoye S. Li
Lawrence Berkeley National Laboratory, Berkeley, CA 94709
Yang Liu
Lawrence Berkeley National Laboratory
computational electromagnetics, sparse matrix, numerical linear algebra, high performance computing
Marcus Noack
Lawrence Berkeley National Laboratory
Mathematics, Mathematical Optimization, Gaussian Processes
Ji Qiang
Lawrence Berkeley National Laboratory, Berkeley, CA 94709
Mark D. Risser
Lawrence Berkeley National Laboratory, Berkeley, CA 94709