Unexpected Improvements to Expected Improvement for Bayesian Optimization

📅 2023-10-31
🏛️ Neural Information Processing Systems
📈 Citations: 36
Influential: 2
🤖 AI Summary
In Bayesian optimization (BO), Expected Improvement (EI) and its variants frequently suffer from numerical ill-conditioning—especially in high-dimensional, constrained, parallel, or noisy settings—leading to unstable performance and optimization failure. This work systematically identifies a shared numerical instability mechanism across EI, Expected Hypervolume Improvement (EHVI), and their extensions. To address this, we propose LogEI, a novel family of acquisition functions that applies a logarithmic transformation whose members have identical or approximately equal optima to their canonical counterparts, while numerical robustness is significantly enhanced. We further derive closed-form analytical gradients and integrate them with efficient optimization strategies to ensure stable convergence. Evaluated across diverse benchmarks—including constrained, multi-objective, parallel, and noisy BO tasks—LogEI consistently outperforms standard EI variants and matches or exceeds state-of-the-art methods. Our results empirically demonstrate that numerical robustness is a decisive factor for practical BO efficacy.
📝 Abstract
Expected Improvement (EI) is arguably the most popular acquisition function in Bayesian optimization and has found countless successful applications, but its performance is often exceeded by that of more recent methods. Notably, EI and its variants, including for the parallel and multi-objective settings, are challenging to optimize because their acquisition values vanish numerically in many regions. This difficulty generally increases as the number of observations, dimensionality of the search space, or the number of constraints grow, resulting in performance that is inconsistent across the literature and most often sub-optimal. Herein, we propose LogEI, a new family of acquisition functions whose members either have identical or approximately equal optima as their canonical counterparts, but are substantially easier to optimize numerically. We demonstrate that numerical pathologies manifest themselves in "classic" analytic EI, Expected Hypervolume Improvement (EHVI), as well as their constrained, noisy, and parallel variants, and propose corresponding reformulations that remedy these pathologies. Our empirical results show that members of the LogEI family of acquisition functions substantially improve on the optimization performance of their canonical counterparts and surprisingly, are on par with or exceed the performance of recent state-of-the-art acquisition functions, highlighting the understated role of numerical optimization in the literature.
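The vanishing-value pathology is easy to reproduce: for a candidate whose predictive mean is far below the incumbent, the closed-form EI, σ·(z·Φ(z) + φ(z)) with z = (μ − f*)/σ, underflows to exactly zero in double precision, so its gradient is zero and the acquisition optimizer stalls. The sketch below contrasts a naive EI with a stable log-space version; the branch threshold and the asymptotic tail (h(z) ≈ φ(z)/z² as z → −∞) are our illustrative assumptions, not the paper's exact LogEI implementation.

```python
import numpy as np
from scipy.stats import norm

def ei(mu, sigma, best_f):
    """Naive analytic Expected Improvement (maximization).

    Underflows to 0.0 for candidates far below the incumbent, which
    zeroes out gradients and defeats gradient-based acquisition optimizers.
    """
    z = (mu - best_f) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

def log_ei(mu, sigma, best_f):
    """Sketch of a numerically stable log(EI).

    log EI = log(sigma) + log h(z), where h(z) = z*Phi(z) + phi(z).
    For very negative z we switch to the asymptotic h(z) ~ phi(z) / z**2,
    keeping the value (and hence gradients) finite instead of -inf/0.
    The cutoff -10.0 is an assumption for this sketch.
    """
    z = (mu - best_f) / sigma
    if z > -10.0:
        log_h = np.log(z * norm.cdf(z) + norm.pdf(z))
    else:
        # log(phi(z)) - 2*log(-z): asymptotic expansion of log h(z) as z -> -inf
        log_h = norm.logpdf(z) - 2.0 * np.log(-z)
    return np.log(sigma) + log_h
```

For example, with μ = 0, σ = 1, and incumbent f* = 40 (z = −40), the naive `ei` returns exactly 0.0, while `log_ei` returns a finite value near −808 that still orders candidates correctly, which is the property the log-space reformulation is after.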
Problem

Research questions and friction points this paper is trying to address.

Bayesian Optimization
Expected Improvement
Numerical Optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

LogEI
Bayesian Optimization
Numerical Stability