Respecting the limit: Bayesian optimization with a bound on the optimal value

📅 2024-11-07
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
For black-box optimization problems with prior knowledge of optimal-value bounds—either the exact minimum or a reliable lower bound—this paper proposes Bound-Aware Bayesian Optimization (BABO). The method introduces SlogGP, a novel surrogate model that incorporates a logarithmic transformation to embed optimal-value bounds directly within the Gaussian process regression framework, enabling bound-aware modeling. Complementing SlogGP, the authors design a bound-aware Expected Improvement (EI) acquisition function. To their knowledge, this is the first approach to jointly integrate optimal-value bounds into both the surrogate model and the acquisition function design. Theoretically and empirically, SlogGP demonstrates greater expressiveness than a standard GP in both bounded and unbounded settings, leading to significantly improved sampling efficiency. Extensive experiments across diverse benchmark tasks show consistent improvement over state-of-the-art Bayesian optimization methods; notably, even in the absence of prior bounds, SlogGP outperforms the standard GP on most tasks.
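The core idea described above—embedding a known lower bound into the GP via a logarithmic transformation—can be illustrated with a minimal sketch. This is not the authors' actual SlogGP implementation (which also learns the shift and kernel hyperparameters); it is a simplified illustration, assuming an RBF kernel with fixed hyperparameters and a known lower bound `bound`, where the GP is fit on `log(y - bound)` so that back-transformed predictions respect the bound by construction:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def bound_aware_posterior(X, y, Xs, bound, noise=1e-6):
    """GP posterior on g = log(y - bound).

    Mapping the latent log-normal posterior back to the original scale
    yields predictive means that are strictly greater than `bound`.
    """
    g = np.log(y - bound)                        # shifted-log targets
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xs, X)
    Kss = rbf_kernel(Xs, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks @ Kinv @ g                           # latent posterior mean
    var = np.diag(Kss - Ks @ Kinv @ Ks.T)        # latent posterior variance
    f_mean = bound + np.exp(mu + 0.5 * var)      # log-normal mean, > bound
    return f_mean, mu, var

# Toy 1-D example: f(x) = x^2 + 0.5 with known lower bound 0.
X = np.array([[-2.0], [-1.0], [0.5], [1.5]])
y = X[:, 0]**2 + 0.5
Xs = np.linspace(-2, 2, 5)[:, None]
f_mean, _, _ = bound_aware_posterior(X, y, Xs, bound=0.0)
assert np.all(f_mean > 0.0)  # every prediction respects the bound
```

A standard GP posterior on the same data could place predictive mass below 0; modeling in the shifted-log space rules that out, which is the property the bound-aware EI acquisition function then exploits.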

📝 Abstract
In many real-world optimization problems, we have prior information about what objective function values are achievable. In this paper, we study the scenario in which we have either exact knowledge of the minimum value or a, possibly inexact, lower bound on it. We propose bound-aware Bayesian optimization (BABO), a Bayesian optimization method that uses a new surrogate model and acquisition function to exploit such prior information. We present SlogGP, a new surrogate model that incorporates bound information, and adapt the Expected Improvement (EI) acquisition function accordingly. Empirical results on a variety of benchmarks demonstrate the benefit of taking prior information about the optimal value into account, and show that the proposed approach significantly outperforms existing techniques. Furthermore, even in the absence of prior information on the bound, the proposed SlogGP surrogate model still performs better than the standard GP model in most cases, which we attribute to its greater expressiveness.
Problem

Research questions and friction points this paper is trying to address.

How can prior knowledge of bounds on the optimal value be incorporated into black-box optimization?
Standard GP surrogates and acquisition functions have no mechanism for exploiting a known minimum or lower bound.
Existing Bayesian optimization techniques leave room for improvement when such bound information is available.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bound-aware Bayesian optimization (BABO) method
SlogGP surrogate model with bound information
Bound-aware adaptation of the Expected Improvement (EI) acquisition function