OPBO: Order-Preserving Bayesian Optimization

📅 2025-12-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high computational complexity and poor scalability of Gaussian process (GP) surrogate models in high-dimensional (>500D) expensive black-box optimization, this paper proposes *Order-Preserving Bayesian Optimization* (OPBO)—a novel paradigm that abandons exact function-value fitting in favor of modeling ordinal relationships between inputs and outputs. OPBO employs a lightweight order-preserving neural network as its surrogate and introduces an “adequately good” solution screening mechanism based on order sets, coupled with a gradient-free acquisition strategy. On multiple benchmark tasks exceeding 500 dimensions, OPBO significantly outperforms both GP-based and regression-based neural-network-driven Bayesian optimization methods, achieving faster convergence and reducing computational overhead by over an order of magnitude. The implementation is publicly available.
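The core idea above is a surrogate trained to preserve the ordering of objective values rather than to fit them exactly. A minimal sketch of one way such an order-preserving objective can look is a pairwise hinge ranking loss; the paper's actual loss, network, and `margin` parameter are not specified here, so these are illustrative assumptions:

```python
def pairwise_order_loss(scores, values, margin=0.1):
    """Hinge-style pairwise ranking loss (illustrative sketch).

    Penalizes pairs of candidates whose surrogate scores disagree with
    the ordering of the true objective values. The hinge form and the
    margin value are assumptions, not the paper's exact formulation.
    """
    n = len(values)
    loss = 0.0
    for i in range(n):
        for j in range(n):
            if values[i] < values[j]:
                # candidate j has the larger objective value,
                # so its score should exceed score[i] by at least `margin`
                loss += max(0.0, margin - (scores[j] - scores[i]))
    return loss / (n * (n - 1))
```

A surrogate minimizing such a loss only needs to rank candidates correctly, which is a weaker (and cheaper) requirement than the exact numerical fit a GP posterior provides.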

📝 Abstract
Bayesian optimization is an effective method for solving expensive black-box optimization problems. Most existing methods use Gaussian processes (GP) as the surrogate model for approximating the black-box objective function, but it is well known that GPs can fail in high-dimensional spaces (e.g., dimension over 500). We argue that the reliance of GPs on precise numerical fitting is fundamentally ill-suited to high-dimensional spaces, where it leads to prohibitive computational complexity. To address this, we propose a simple order-preserving Bayesian optimization (OPBO) method, in which the surrogate model preserves the order, rather than the value, of the black-box objective function. This allows a simple but effective order-preserving (OP) neural network (NN) to replace the GP as the surrogate model. Moreover, instead of searching for the best solution from the acquisition model, we select good-enough solutions from the ordinal set to reduce computational cost. Experimental results show that for high-dimensional (over 500) black-box optimization problems, the proposed OPBO significantly outperforms traditional BO methods based on regression NNs and GPs. The source code is available at https://github.com/pengwei222/OPBO.
Problem

Research questions and friction points this paper is trying to address.

Addresses high-dimensional black-box optimization limitations
Replaces Gaussian processes with order-preserving neural networks
Reduces computational cost by selecting good-enough solutions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Surrogate models the ordering of objective values instead of fitting them exactly
Lightweight order-preserving neural network replaces the GP surrogate
Good-enough solutions selected from the ordinal set via a gradient-free acquisition strategy
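The "good-enough" selection idea can be sketched as follows: rather than an argmax search over the acquisition model, keep a top-ranked fraction of candidates (the ordinal "adequately good" set) and sample a few of them for evaluation. The fraction, sample size, and function names below are illustrative assumptions, not the paper's exact mechanism:

```python
import numpy as np

def select_good_enough(candidates, scores, top_frac=0.1, n_pick=4, rng=None):
    """Pick a few candidates from the top-ranked ordinal set (sketch).

    `top_frac` and `n_pick` are hypothetical parameters; the paper's
    screening mechanism may use a different criterion.
    """
    rng = np.random.default_rng() if rng is None else rng
    order = np.argsort(scores)[::-1]            # best-first by surrogate score
    k = max(1, int(top_frac * len(candidates)))
    good = order[:k]                            # ordinal "good enough" set
    picks = rng.choice(good, size=min(n_pick, k), replace=False)
    return candidates[picks]
```

Because only the ranking of candidates matters, this step avoids any gradient-based inner optimization of the acquisition function.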
Wei Peng
School of Automation Science and Engineering, Xi’an Jiaotong University, Xi’an, China
Jianchen Hu
School of Automation Science and Engineering, Xi’an Jiaotong University, Xi’an, China
Kang Liu
School of Future Technology, Xi’an Jiaotong University, Xi’an, China
Qiaozhu Zhai
Xi'an Jiaotong University