AI Summary
Expected Improvement (EI), a widely used acquisition function in Bayesian optimization, lacks an information-theoretic interpretation and is difficult to calibrate adaptively. Method: This work establishes, for the first time, a unified variational inference framework linking EI and Max-value Entropy Search (MES), proving that EI is a special case of MES under a specific variational assumption. Building on this insight, we propose the Variational Entropy Search (VES) paradigm and instantiate it as VES-Gamma, which approximates the posterior distribution of the optimal value with a Gamma distribution, yielding a principled enhancement and automatic calibration of EI. Results: On standard benchmark functions and real-world black-box optimization tasks, VES-Gamma significantly improves sampling efficiency and convergence stability, demonstrating both theoretical consistency with entropy-based principles and practical advantages over existing methods.
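The summary says VES-Gamma approximates the optimal value's posterior with a Gamma distribution. As a rough, hypothetical illustration only (the paper's actual VES-Gamma procedure is not given here), one simple way to fit a Gamma distribution to samples of the optimum is method-of-moments matching:

```python
import numpy as np

def fit_gamma_moments(samples):
    """Method-of-moments Gamma fit to a sample of (positive) values.

    For a Gamma(k, theta) distribution, mean = k*theta and var = k*theta**2,
    so k = mean**2 / var and theta = var / mean. This is an illustrative
    stand-in for approximating a max-value posterior, not the paper's
    actual VES-Gamma algorithm.
    """
    samples = np.asarray(samples, dtype=float)
    m, v = samples.mean(), samples.var()
    shape = m * m / v      # k
    scale = v / m          # theta
    return shape, scale
```

In a Bayesian-optimization setting, the `samples` would typically be draws of the maximum value obtained from posterior samples of the surrogate model.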
Abstract
Bayesian optimization is a widely used technique for optimizing black-box functions, with Expected Improvement (EI) being the most commonly used acquisition function in this domain. While EI is often viewed as distinct from information-theoretic acquisition functions such as entropy search (ES) and max-value entropy search (MES), our work reveals that EI can be regarded as a special case of MES when approached through variational inference (VI). Building on this view, we develop the Variational Entropy Search (VES) methodology and the VES-Gamma algorithm, which adapts EI by incorporating information-theoretic principles. The efficacy of VES-Gamma is demonstrated across a variety of test functions and real-world datasets, highlighting its theoretical and practical utility in Bayesian optimization.
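For concreteness, the EI acquisition function discussed throughout has a standard closed form under a Gaussian surrogate posterior. The sketch below (generic, not tied to this paper's specific notation) computes EI for maximization given the posterior mean, standard deviation, and incumbent best value:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Analytic EI for a Gaussian posterior N(mu, sigma**2), maximization case.

    EI(x) = (mu - f_best) * Phi(z) + sigma * phi(z),  z = (mu - f_best) / sigma,
    where Phi and phi are the standard normal CDF and PDF. This is the
    textbook closed form; VES-Gamma's modification of EI is not shown here.
    """
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    z = (mu - f_best) / sigma
    return (mu - f_best) * norm.cdf(z) + sigma * norm.pdf(z)
```

Note that EI is always nonnegative and grows with posterior uncertainty `sigma`, which is the exploration/exploitation trade-off the information-theoretic reinterpretation above seeks to explain.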