We Still Don't Understand High-Dimensional Bayesian Optimization

📅 2025-11-28
🤖 AI Summary
High-dimensional Bayesian optimization (BO) suffers from the "curse of dimensionality": existing structure-exploiting approaches (e.g., those assuming sparsity or smoothness) often underperform even Bayesian linear regression, revealing fundamental limitations of conventional heuristics. The paper identifies the root cause as the boundary-seeking behavior of standard kernel functions in high dimensions. To address this, it proposes a minimalist yet effective paradigm: Gaussian processes with linear kernels, augmented by an invertible geometric transformation that avoids boundary-seeking behavior, with closed-form posterior sampling and computation that scales linearly in the number of observations. The method matches state-of-the-art high-dimensional BO algorithms on tasks with 60- to 6,000-dimensional search spaces and scales to ultra-large tasks such as molecular optimization with over 20,000 function evaluations. Crucially, it dispenses with complex structural priors, suggesting a simpler, more scalable design principle for high-dimensional BO.

📝 Abstract
High-dimensional spaces have challenged Bayesian optimization (BO). Existing methods aim to overcome this so-called curse of dimensionality by carefully encoding structural assumptions, from locality to sparsity to smoothness, into the optimization procedure. Surprisingly, we demonstrate that these approaches are outperformed by arguably the simplest method imaginable: Bayesian linear regression. After applying a geometric transformation to avoid boundary-seeking behavior, Gaussian processes with linear kernels match state-of-the-art performance on tasks with 60- to 6,000-dimensional search spaces. Linear models offer numerous advantages over their non-parametric counterparts: they afford closed-form sampling and their computation scales linearly with data, a fact we exploit on molecular optimization tasks with > 20,000 observations. Coupled with empirical analyses, our results suggest the need to depart from past intuitions about BO methods in high-dimensional spaces.
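The abstract's core computational claim, that linear kernels afford closed-form posterior sampling and cost that scales linearly with the number of observations, can be sketched as follows. This is a minimal illustration of standard Bayesian linear regression; the function names and hyperparameter values are assumptions, not the paper's implementation.

```python
import numpy as np

def blr_posterior(X, y, alpha=1.0, noise=0.1):
    """Closed-form Bayesian linear regression posterior over weights.

    Equivalent to a GP with a linear kernel. Building the d x d
    precision matrix costs O(n d^2): linear in the number of
    observations n, unlike the O(n^3) cost of a generic GP.
    alpha is the prior weight precision, noise the observation std
    (both illustrative choices).
    """
    n, d = X.shape
    A = alpha * np.eye(d) + X.T @ X / noise**2  # posterior precision
    cov = np.linalg.inv(A)                      # posterior covariance
    mean = cov @ X.T @ y / noise**2             # posterior mean
    return mean, cov

def sample_functions(mean, cov, X_new, n_samples=5, rng=None):
    """Draw exact posterior function samples (e.g. for Thompson sampling)."""
    if not isinstance(rng, np.random.Generator):
        rng = np.random.default_rng(rng)
    W = rng.multivariate_normal(mean, cov, size=n_samples)  # weight samples
    return X_new @ W.T  # column j is the j-th sampled linear function

# Tiny synthetic demonstration.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
w_true = rng.normal(size=50)
y = X @ w_true + 0.1 * rng.normal(size=200)
mean, cov = blr_posterior(X, y)
```

With enough data the posterior mean concentrates on the true weights, and sampled functions can be ranked at candidate points without any approximate inference.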
Problem

Research questions and friction points this paper is trying to address.

Addresses the curse of dimensionality in high-dimensional Bayesian optimization
Questions whether structural assumptions (locality, sparsity, smoothness) actually help, by comparing them against plain Bayesian linear regression
Identifies boundary-seeking behavior of standard kernels as a key failure mode, motivating a geometric transformation of the search space
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Gaussian processes with linear kernels, i.e., Bayesian linear regression, for high-dimensional optimization
Applies an invertible geometric transformation to the search space to avoid boundary-seeking behavior
Exploits closed-form posterior sampling and linear-in-data scaling on tasks with > 20,000 observations
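To illustrate the geometric-transformation idea in the second bullet: an invertible warp lets the model work in an unbounded latent space while every candidate point maps strictly inside the bounded search box, so optima are not pinned to corners. The sigmoid/logit pair below is a stand-in chosen for simplicity; the paper's exact transformation may differ.

```python
import numpy as np

def to_unit_cube(z):
    """Invertible map from unbounded latent space R^d to (0, 1)^d.

    A coordinatewise sigmoid warp; one simple choice of
    geometric transformation, assumed here for illustration.
    """
    return 1.0 / (1.0 + np.exp(-z))

def from_unit_cube(x):
    """Exact inverse (logit), so points chosen in latent space can be
    evaluated in the original bounded search space and back."""
    return np.log(x) - np.log1p(-x)

# Latent candidates land strictly inside the box: no point sits on the
# boundary, in contrast to the boundary-seeking acquisition maxima of
# standard kernels in high dimensions.
z = np.random.default_rng(0).normal(size=(4, 6000))
x = to_unit_cube(z)
```

Because the map is invertible, nothing is lost by optimizing in the latent space: any point in the open unit cube corresponds to exactly one latent point.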