Practical Efficient Global Optimization is No-regret

📅 2026-03-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the widely used yet theoretically underexplored efficient global optimization (EGO) algorithm with a nugget term, establishing for the first time a sublinear cumulative regret bound. Combining a Gaussian process surrogate model, the expected improvement acquisition function, and nugget-based regularization of the covariance matrix, the authors rigorously prove that this practical EGO variant is no-regret under both the squared exponential and Matérn kernels (with ν > 1/2). The analysis quantifies the influence of the nugget on the regret bound and is complemented by empirical validation, which both confirms the sublinear regret upper bound and offers principled guidance for selecting the nugget parameter in practice.
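For context, the cumulative regret that the paper bounds can be written as follows (standard Bayesian-optimization notation under a minimization convention; this formulation is assumed, not taken from the paper itself):

```latex
% Cumulative regret after T evaluations of an objective f over a domain D,
% where x_t is the point queried at step t (minimization convention, assumed):
R_T = \sum_{t=1}^{T} \Bigl( f(x_t) - \min_{x \in D} f(x) \Bigr)

% "No-regret" means the average regret vanishes, i.e. R_T grows sublinearly:
\lim_{T \to \infty} \frac{R_T}{T} = 0
```

A sublinear bound on $R_T$ implies that the best point queried so far converges to the global optimum as $T$ grows.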

📝 Abstract
Efficient global optimization (EGO) is one of the most widely used noise-free Bayesian optimization algorithms. It comprises the Gaussian process (GP) surrogate model and the expected improvement (EI) acquisition function. In practice, when EGO is applied, a scalar matrix of a small positive value (also called a nugget or jitter) is usually added to the covariance matrix of the deterministic GP to improve numerical stability. We refer to this EGO with a positive nugget as the practical EGO. Despite its wide adoption and empirical success, to date, cumulative regret bounds for practical EGO have yet to be established. In this paper, we present for the first time the cumulative regret upper bound of practical EGO. In particular, we show that practical EGO has sublinear cumulative regret bounds and thus is a no-regret algorithm for commonly used kernels including the squared exponential (SE) and Matérn kernels ($\nu > \frac{1}{2}$). Moreover, we analyze the effect of the nugget on the regret bound and discuss the theoretical implication on its choice. Numerical experiments are conducted to support and validate our findings.
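The loop the abstract describes, a GP surrogate with a nugget added to the covariance matrix, plus EI to pick the next query, can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the 1-D toy objective, the SE length-scale, the nugget value `1e-6`, and the grid of candidates are all assumed for demonstration.

```python
import numpy as np
from math import erf, sqrt, pi

def se_kernel(A, B, ls=0.3):
    # Squared exponential kernel on 1-D inputs (unit amplitude, length-scale assumed)
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(Xtr, ytr, Xte, nugget=1e-6):
    # Noise-free GP posterior; the nugget*I term is the jitter that keeps
    # the Cholesky factorization numerically stable ("practical EGO")
    K = se_kernel(Xtr, Xtr) + nugget * np.eye(len(Xtr))
    Ks = se_kernel(Xtr, Xte)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v ** 2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sd, best):
    # EI for minimization: E[max(best - f(x), 0)] under the GP posterior
    z = (best - mu) / sd
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))   # standard normal CDF
    phi = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)           # standard normal PDF
    return (best - mu) * Phi + sd * phi

# Toy 1-D objective on [0, 1] (assumed for illustration)
f = lambda x: np.sin(3.0 * x) + x ** 2

X = np.array([0.1, 0.5, 0.9])      # initial design
y = f(X)
cand = np.linspace(0.0, 1.0, 201)  # candidate grid standing in for EI maximization

for _ in range(10):                # 10 EGO iterations
    mu, sd = gp_posterior(X, y, cand, nugget=1e-6)
    xnext = cand[np.argmax(expected_improvement(mu, sd, y.min()))]
    X = np.append(X, xnext)
    y = np.append(y, f(xnext))

print(f"best value found: {y.min():.4f}")
```

Note that if EI re-selects a point close to one already evaluated, the kernel matrix becomes nearly singular; the nugget is exactly what keeps the Cholesky factorization well-posed in that regime, which is the practical motivation the abstract refers to.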
Problem

Research questions and friction points this paper is trying to address.

Efficient Global Optimization
Cumulative Regret
No-regret Algorithm
Gaussian Process
Nugget
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian optimization
expected improvement
Gaussian process
cumulative regret
nugget