Understanding High-Dimensional Bayesian Optimization

📅 2025-02-13
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In high-dimensional Bayesian optimization (HBO), simple methods often outperform sophisticated ones. The authors trace this to vanishing gradients in Gaussian process (GP) surrogates caused by poor length-scale initialization, which undermines global surrogate modeling, and to the fact that local search strategies better suit sparse, high-dimensional response landscapes. Method: MSR (Maximum-Likelihood-based Scale-adaptive Refinement), a length-scale adaptation strategy grounded in maximum likelihood estimation that explicitly strengthens local exploration to mitigate vanishing gradients. Contribution/Results: theoretical analysis and targeted ablation experiments validate MSR's mechanism, and on multiple real-world black-box optimization benchmarks MSR consistently achieves state-of-the-art (SOTA) performance, surpassing existing HBO methods. The work offers a principled, practical recipe for high-dimensional Bayesian optimization.

📝 Abstract
Recent work reported that simple Bayesian optimization methods perform well for high-dimensional real-world tasks, seemingly contradicting prior work and tribal knowledge. This paper investigates the 'why'. We identify fundamental challenges that arise in high-dimensional Bayesian optimization and explain why recent methods succeed. Our analysis shows that vanishing gradients caused by Gaussian process initialization schemes play a major role in the failures of high-dimensional Bayesian optimization and that methods that promote local search behaviors are better suited for the task. We find that maximum likelihood estimation of Gaussian process length scales suffices for state-of-the-art performance. Based on this, we propose a simple variant of maximum likelihood estimation called MSR that leverages these findings to achieve state-of-the-art performance on a comprehensive set of real-world applications. We also present targeted experiments to illustrate and confirm our findings.
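The vanishing-gradient effect the abstract describes can be illustrated with a minimal NumPy sketch (not the paper's code, and the dimension-scaled length scale below is only a stand-in for the likelihood-fitted length scales the paper studies): with a fixed unit length scale, the squared-exponential kernel correlation between random high-dimensional points collapses toward zero, so the GP posterior is nearly flat away from the data and its gradients vanish; a length scale that grows with dimension keeps the correlation informative.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 100                       # input dimension
X = rng.uniform(size=(2, d))  # two random points in [0, 1]^d


def rbf(x, y, ell):
    """Squared-exponential kernel with a single shared length scale ell."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * ell**2))


# Fixed unit length scale: the kernel value between typical points decays
# exponentially in d, leaving a near-flat posterior with vanishing gradients.
k_fixed = rbf(X[0], X[1], ell=1.0)

# Dimension-scaled length scale (ell ~ sqrt(d)), a crude proxy for what
# maximum-likelihood fitting tends to recover: correlation stays useful.
k_scaled = rbf(X[0], X[1], ell=np.sqrt(d))

print(f"k with ell=1:       {k_fixed:.2e}")  # tiny, ~1e-4
print(f"k with ell=sqrt(d): {k_scaled:.2f}")  # close to 1
```

The contrast between the two kernel values is the crux: acquisition functions built on the first surrogate see an almost constant landscape, while the second retains exploitable structure.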
Problem

Research questions and friction points this paper is trying to address.

Why do simple Bayesian optimization methods succeed on high-dimensional tasks where sophisticated ones struggle?
Which mechanisms explain the success of recent high-dimensional Bayesian optimization methods?
Can a simple, principled method achieve state-of-the-art performance on real-world tasks?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Identifies vanishing gradients from Gaussian process initialization as a key failure mode in high-dimensional Bayesian optimization
Proposes MSR, a simple variant of maximum likelihood estimation of length scales
Shows that promoting local search behavior is better suited to high-dimensional tasks
Leonard Papenmeier
Department of Computer Science, Lund University, Lund, Sweden
Matthias Poloczek
Amazon (This research does not relate to Matthias’ work at Amazon.)
Luigi Nardi
Associate Professor in Machine Learning at Lund University
Machine Learning · Bayesian Optimization