🤖 AI Summary
To address the high computational complexity and poor scalability of Gaussian process (GP) surrogate models in high-dimensional (>500D) expensive black-box optimization, this paper proposes *Order-Preserving Bayesian Optimization* (OPBO)—a novel paradigm that abandons exact function-value fitting in favor of modeling ordinal relationships between inputs and outputs. OPBO employs a lightweight order-preserving neural network as its surrogate and introduces an “adequately good” solution screening mechanism based on order sets, coupled with a gradient-free acquisition strategy. On multiple benchmark tasks exceeding 500 dimensions, OPBO significantly outperforms both GP-based and regression-based neural-network-driven Bayesian optimization methods, achieving faster convergence and reducing computational overhead by over an order of magnitude. The implementation is publicly available.
📝 Abstract
Bayesian optimization (BO) is an effective method for solving expensive black-box optimization problems. Most existing methods use a Gaussian process (GP) as the surrogate model to approximate the black-box objective function, but it is well known that GPs can fail in high-dimensional spaces (e.g., dimension over 500). We argue that the GP's reliance on precise numerical fitting is fundamentally ill-suited to high-dimensional spaces, where it leads to prohibitive computational complexity. To address this, we propose a simple order-preserving Bayesian optimization (OPBO) method, in which the surrogate model preserves the order, rather than the value, of the black-box objective function. This allows a simple but effective order-preserving (OP) neural network (NN) to replace the GP as the surrogate model. Moreover, instead of searching for the single best solution under the acquisition model, we select good-enough solutions from the ordinal set to reduce computational cost. Experimental results show that on high-dimensional (over 500D) black-box optimization problems, the proposed OPBO significantly outperforms traditional BO methods based on regression NNs and GPs. The source code is available at https://github.com/pengwei222/OPBO.
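The core idea of an order-preserving surrogate can be illustrated with a pairwise ranking loss: the model is penalized only when it mis-orders a pair of inputs, not when its predicted values differ from the true objective. The sketch below is illustrative, assuming a simple hinge-style formulation; it is not the paper's exact loss function, and `pairwise_order_loss` and the `margin` parameter are hypothetical names for this example.

```python
import numpy as np

def pairwise_order_loss(pred: np.ndarray, target: np.ndarray, margin: float = 0.1) -> float:
    """Hinge-style pairwise ranking loss (illustrative sketch).

    Penalizes pairs (i, j) whose predicted ordering disagrees with the
    true ordering of the black-box objective values; the exact values
    of `pred` relative to `target` do not matter, only their order.
    """
    # Pairwise differences: entry (i, j) is value[i] - value[j].
    diff_true = target[:, None] - target[None, :]
    diff_pred = pred[:, None] - pred[None, :]
    # +1 / -1 / 0 depending on the true ordering of each pair.
    sign = np.sign(diff_true)
    # Hinge: zero when the predicted order agrees (with margin),
    # positive when the pair is mis-ordered or too close.
    losses = np.maximum(0.0, margin - sign * diff_pred)
    # Average over pairs with distinct true values only.
    mask = sign != 0
    return float(losses[mask].mean())
```

A surrogate trained with such a loss is invariant to monotone transformations of the objective, which is why an inexpensive NN can stand in for a GP when only the ranking of candidates (and hence the "good-enough" ordinal screening described above) is needed.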