🤖 AI Summary
This work proposes a novel approach to batch acquisition function optimization in multi-objective Bayesian optimization, integrating simulated annealing (a metaheuristic algorithm) as a replacement for conventional gradient-based optimizers such as SLSQP. The method aims to better balance diversity and convergence of the Pareto front approximation, particularly in high-dimensional or complex objective spaces where gradient-based methods often become trapped in local optima. Implemented within the q-Expected Hypervolume Improvement (qEHVI) framework, the approach is evaluated on benchmark problems including ZDT1, DTLZ2, Kursawe, and Latent-Aware. Experimental results demonstrate consistent improvements in hypervolume metrics and Pareto front coverage across most test cases, with especially notable gains on DTLZ2 and Latent-Aware.
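The hypervolume metric used to compare the two optimizers has an exact closed form in two objectives: it is the area dominated by the Pareto points and bounded by a reference point. The sketch below is not code from the paper; it is a minimal illustration (all names hypothetical) for a 2-objective minimization problem, assuming the reference point is dominated by every candidate point.

```python
def hypervolume_2d(points, ref):
    """Exact hypervolume for 2-objective minimization: area dominated by
    the Pareto-optimal points, bounded above by the reference point ref.
    Assumes every point is componentwise below ref."""
    # Sort by the first objective; keep only non-dominated points.
    front = []
    for x, y in sorted(points):
        # With x ascending, a point survives only if its y strictly
        # improves on the last kept point's y.
        if not front or y < front[-1][1]:
            front.append((x, y))
    # Sum the disjoint rectangles between consecutive front points.
    hv = 0.0
    for i, (x, y) in enumerate(front):
        x_next = front[i + 1][0] if i + 1 < len(front) else ref[0]
        hv += (x_next - x) * (ref[1] - y)
    return hv
```

A larger hypervolume means the approximated front both converges closer to the true front and spreads wider across it, which is why the paper reports it alongside coverage histograms.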
📝 Abstract
Bayesian Optimization with multi-objective acquisition functions such as q-Expected Hypervolume Improvement (qEHVI) requires efficient candidate optimization to maximize acquisition function values. Traditional approaches rely on continuous optimization methods like Sequential Least Squares Programming (SLSQP) for candidate selection. However, these gradient-based methods can become trapped in local optima, particularly in complex or high-dimensional objective landscapes. This paper presents a simulated annealing-based approach for candidate optimization in batch acquisition functions as an alternative to conventional continuous optimization methods. We evaluate our simulated annealing approach against SLSQP across four benchmark multi-objective optimization problems: ZDT1 (30D, 2 objectives), DTLZ2 (7D, 3 objectives), Kursawe (3D, 2 objectives), and Latent-Aware (4D, 2 objectives). Our results demonstrate that simulated annealing achieves superior hypervolume performance compared to SLSQP on most of the test functions. The improvement is particularly pronounced for the DTLZ2 and Latent-Aware problems, where simulated annealing reaches significantly higher hypervolume values and maintains better convergence characteristics. Histogram analysis of objective-space coverage further reveals that simulated annealing explores more diverse and optimal regions of the Pareto front. These findings suggest that metaheuristic optimization approaches like simulated annealing can provide more robust and effective candidate optimization for multi-objective Bayesian optimization, offering a promising alternative to traditional gradient-based methods for batch acquisition function optimization.
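The core idea, replacing a gradient-based inner optimizer with simulated annealing over the q-batch of candidates, can be sketched in a few lines. The snippet below is not the paper's implementation: the acquisition function is a toy stand-in for qEHVI, and all names, step sizes, and the cooling schedule are illustrative assumptions. It shows the Metropolis accept/reject loop over a flattened batch of q points in [0, 1]^d.

```python
import math
import random

def toy_acquisition(batch):
    # Stand-in for a batch acquisition value such as qEHVI (hypothetical):
    # rewards candidates near 0.3 in every dimension plus batch diversity.
    fit = -sum((x - 0.3) ** 2 for cand in batch for x in cand)
    div = sum(abs(a - b) for i, c1 in enumerate(batch)
              for c2 in batch[i + 1:] for a, b in zip(c1, c2))
    return fit + 0.1 * div

def anneal_batch(acq, q=3, dim=2, n_iters=2000, t0=1.0, cooling=0.995, seed=0):
    """Maximize acq over a batch of q points in [0,1]^dim by simulated
    annealing: Gaussian perturbations, Metropolis acceptance, geometric cooling."""
    rng = random.Random(seed)
    batch = [[rng.random() for _ in range(dim)] for _ in range(q)]
    best, best_val = [c[:] for c in batch], acq(batch)
    cur_val, t = best_val, t0
    for _ in range(n_iters):
        # Perturb every coordinate and clip back into the unit box.
        cand = [[min(1.0, max(0.0, x + rng.gauss(0, 0.05))) for x in c]
                for c in batch]
        val = acq(cand)
        # Accept improvements always; accept worse moves with
        # probability exp(delta / t), which shrinks as t cools.
        if val > cur_val or rng.random() < math.exp((val - cur_val) / t):
            batch, cur_val = cand, val
            if cur_val > best_val:
                best, best_val = [c[:] for c in batch], cur_val
        t *= cooling
    return best, best_val
```

Because acceptance of worse moves is temperature-controlled, the search can escape local optima of the acquisition surface early on and settle into exploitation as the temperature decays, which is the behavior the paper contrasts with SLSQP's purely local updates.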