Leveraging Axis-Aligned Subspaces for High-Dimensional Bayesian Optimization with Group Testing

📅 2025-04-08
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the challenge of unknown and unverifiable variable importance in high-dimensional black-box optimization, this paper proposes a two-stage axis-aligned Bayesian optimization framework. In the first stage, Group Testing (GT) adapted to continuous domains identifies the truly active variables—those that genuinely influence the objective—rather than assuming their importance a priori. In the second stage, efficient Gaussian process optimization is performed within the inferred low-dimensional, axis-aligned subspace. The work extends group testing theory to functions over continuous domains, enabling verifiable active-variable identification, and the axis-aligned modeling improves both interpretability and sampling efficiency. Empirical evaluation on standard benchmarks shows faster convergence and an explicit, reliable ranking of variable importance, outperforming state-of-the-art methods on problems that satisfy the axis-aligned subspace assumption.

📝 Abstract
Bayesian optimization (BO) is an effective method for optimizing expensive-to-evaluate black-box functions. While high-dimensional problems can be particularly challenging, due to the multitude of parameter choices and the potentially high number of data points required to fit the model, this limitation can be addressed if the problem satisfies simplifying assumptions. The axis-aligned subspace assumption, under which only a few dimensions significantly impact the objective, has motivated several algorithms for high-dimensional BO. However, the validity of this assumption is rarely verified, and the assumption is rarely exploited to its full extent. We propose a group testing (GT) approach to identify active variables to facilitate efficient optimization in these domains. The proposed algorithm, Group Testing Bayesian Optimization (GTBO), first runs a testing phase where groups of variables are systematically selected and tested on whether they influence the objective, then terminates once active dimensions are identified. To that end, we extend the well-established GT theory to functions over continuous domains. In the second phase, GTBO guides optimization by placing more importance on the active dimensions. By leveraging the axis-aligned subspace assumption, GTBO outperforms state-of-the-art methods on benchmarks satisfying the assumption of axis-aligned subspaces, while offering improved interpretability.
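The core idea of the testing phase—probe groups of variables together and recursively split any group that influences the objective—can be illustrated with a minimal sketch. This is not the paper's GTBO implementation: it assumes a noise-free objective, a fixed reference point, and a simple perturbation-threshold test, and all function names are illustrative.

```python
import numpy as np

def group_influences(f, x0, group, rng, eps=1e-6, n_probes=3):
    """Crude group test: does randomly perturbing the variables in
    `group` (holding all others at the reference point) change f?"""
    base = f(x0)
    for _ in range(n_probes):
        x = x0.copy()
        x[group] = rng.uniform(0.0, 1.0, size=len(group))
        if abs(f(x) - base) > eps:
            return True
    return False

def find_active_dims(f, dim, eps=1e-6, seed=0):
    """Binary-splitting group testing over [0,1]^dim: an inactive
    group is discarded with a handful of evaluations, while an
    active group is split until single active dimensions remain."""
    rng = np.random.default_rng(seed)
    x0 = np.full(dim, 0.5)                # reference point
    active, stack = [], [list(range(dim))]
    while stack:
        group = stack.pop()
        if not group_influences(f, x0, group, rng, eps):
            continue                      # whole group inactive
        if len(group) == 1:
            active.append(group[0])
        else:
            mid = len(group) // 2
            stack.extend([group[:mid], group[mid:]])
    return sorted(active)

# Toy objective: only dimensions 0 and 7 matter out of 20.
f = lambda x: (x[0] - 0.3) ** 2 + np.sin(3 * x[7])
print(find_active_dims(f, 20))
```

When the active set is small, inactive groups are eliminated in large blocks, which is what gives group testing its evaluation efficiency over testing each dimension separately. The paper's actual method additionally handles noisy observations and maintains posterior probabilities over which variables are active.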
Problem

Research questions and friction points this paper is trying to address.

Identify active variables in high-dimensional Bayesian optimization
Extend group testing theory to continuous domain functions
Optimize performance by focusing on influential dimensions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Group testing identifies active variables efficiently
Leverages axis-aligned subspaces for optimization
GTBO outperforms state-of-the-art methods