🤖 AI Summary
This work addresses the lack of a unified, high-performance Bayesian optimization framework in the R ecosystem capable of handling complex scenarios such as multi-objective optimization, batch suggestion, asynchronous parallelization, and mixed variable types. To bridge this gap, we introduce mlr3mbo, a modular Bayesian optimization framework built on mlr3 that, for the first time in R, comprehensively supports these advanced features and enables flexible composition of custom components. Through systematic hyperparameter tuning, we establish robust default configurations that allow mlr3mbo to match or exceed the performance of leading optimizers, including HEBO, SMAC3, Ax, and Optuna, on the YAHPO Gym benchmark suite, demonstrating both its practical utility and its state-of-the-art capabilities.
📝 Abstract
We present mlr3mbo, a comprehensive and modular toolbox for Bayesian optimization (BO) in R. mlr3mbo supports single- and multi-objective optimization, multi-point proposals, batch and asynchronous parallelization, input and output transformations, and robust error handling. While it can be used for many standard BO variants in applied settings, researchers can also construct custom BO algorithms from its flexible building blocks. In addition to an introduction to the software, its design principles, and its building blocks, the paper presents two extensive empirical evaluations of the software on the surrogate-based benchmark suite YAHPO Gym. To identify robust default configurations for both numeric and mixed-hierarchical optimization regimes, and to gain further insights into the respective impacts of individual settings, we run a coordinate descent search over the mlr3mbo configuration space and analyze its results. Furthermore, we demonstrate that mlr3mbo achieves state-of-the-art performance by benchmarking it against a wide range of optimizers, including HEBO, SMAC3, Ax, and Optuna.
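To make the abstract's description concrete, the following is a minimal sketch of how a standard single-objective BO run might look with mlr3mbo on top of the bbotk optimization framework. The toy objective, its bounds, and the evaluation budget are illustrative choices, not taken from the paper; exact class and constructor names may differ across package versions, so consult the package documentation before running.

```r
library(bbotk)
library(mlr3mbo)

# Hypothetical 1-d objective: minimize (x - 2)^2 over [-5, 5].
objective = ObjectiveRFun$new(
  fun = function(xs) list(y = (xs$x - 2)^2),
  domain = ps(x = p_dbl(lower = -5, upper = 5)),
  codomain = ps(y = p_dbl(tags = "minimize"))
)

# Optimization instance with a small, illustrative budget of 20 evaluations.
instance = OptimInstanceSingleCrit$new(
  objective = objective,
  terminator = trm("evals", n_evals = 20)
)

# Run BO with the package's default surrogate and acquisition settings,
# then inspect the best configuration found.
opt("mbo")$optimize(instance)
instance$result
```

The same `opt("mbo")` optimizer can, per the abstract, be reconfigured from individual building blocks (surrogate model, acquisition function, acquisition optimizer) rather than used with defaults only.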