🤖 AI Summary
High-dimensional black-box optimization—e.g., database configuration tuning—remains challenging due to prohibitive query costs and the curse of dimensionality.
Method: This paper proposes a hierarchical Bayesian optimization framework comprising a global navigator that adaptively partitions the search space via a search tree, and a local optimizer that incorporates partition-wise potential estimates into acquisition decisions. Crucially, a dynamic spatial partitioning mechanism is embedded directly into the acquisition function, enabling synergistic exploration and exploitation without additional hyperparameters. The method employs a potential-weighted Expected Improvement (EI) or Upper Confidence Bound (UCB) acquisition strategy.
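The potential-weighted acquisition idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a Gaussian process posterior (mean `mu`, stddev `sigma`) over a minimization objective and a per-candidate `potential` weight supplied by the global navigator's partition estimates; the function name and the simple multiplicative weighting are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def potential_weighted_ei(mu, sigma, best_f, potential, xi=0.0):
    """Expected Improvement scaled by a partition-level potential estimate.

    mu, sigma : GP posterior mean/stddev at candidate points (minimization).
    best_f    : best observed objective value so far.
    potential : per-candidate weight derived from the global navigator's
                partition (illustrative; the paper's exact weighting may differ).
    xi        : optional exploration margin.
    """
    sigma = np.maximum(sigma, 1e-12)  # guard against zero predictive stddev
    z = (best_f - mu - xi) / sigma
    # Standard closed-form EI for minimization.
    ei = (best_f - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)
    # Bias acquisition toward partitions the navigator deems promising.
    return np.maximum(ei, 0.0) * potential
```

At each iteration the local optimizer would evaluate this over its candidate set and query the argmax; an analogous weighting could be applied to a UCB score instead of EI.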
Results: On high-dimensional synthetic benchmarks, HiBO significantly outperforms state-of-the-art methods. In real-world database tuning tasks, it reduces query latency by up to 37%, demonstrating both effectiveness and practical applicability.
📝 Abstract
Optimizing black-box functions in high-dimensional search spaces is known to be challenging for traditional Bayesian Optimization (BO). In this paper, we introduce HiBO, a novel hierarchical algorithm that integrates global-level search-space partitioning information into the acquisition strategy of a local BO-based optimizer. HiBO employs a search-tree-based global-level navigator to adaptively split the search space into partitions with different sampling potential. The local optimizer then utilizes this global-level information to guide its acquisition strategy towards the most promising regions of the search space. A comprehensive set of evaluations demonstrates that HiBO outperforms state-of-the-art methods on high-dimensional synthetic benchmarks and shows significant practical effectiveness in the real-world task of tuning configurations of database management systems (DBMSs).