🤖 AI Summary
Bayesian optimization (BO) of black-box functions that are invariant under a group action could benefit from maximum-similarity (max) kernels, but the max kernel violates positive semi-definiteness (PSD) and therefore cannot be used directly as a Gaussian process covariance in BO.
Method: We propose Symmetry-Aware Bayesian Optimization, a framework that constructs a valid kernel by projecting the non-PSD max kernel onto the PSD cone, retaining symmetry-aware modeling without increasing computational complexity.
Contribution/Results: Empirically, the approach achieves significantly lower cumulative regret than both state-of-the-art invariance-aware kernels and standard kernels across synthetic and real-world BO benchmarks. The result is a plug-and-play way to incorporate symmetry-structured priors into BO while guaranteeing a PSD covariance.
📝 Abstract
Bayesian Optimization (BO) is a powerful framework for optimizing noisy, expensive-to-evaluate black-box functions. When the objective exhibits invariances under a group action, exploiting these symmetries can substantially improve BO efficiency. While using maximum similarity across group orbits has long been considered in other domains, the fact that the max kernel is not positive semidefinite (PSD) has prevented its use in BO. In this work, we revisit this idea by considering a PSD projection of the max kernel. Compared to existing invariant (and non-invariant) kernels, we show it achieves significantly lower regret on both synthetic and real-world BO benchmarks, without increasing computational complexity.
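The core idea can be illustrated with a minimal sketch: build a max-similarity kernel over a permutation group's orbits, observe that its Gram matrix need not be PSD, and project it onto the PSD cone by clipping negative eigenvalues. This is an illustrative toy under assumed choices (an RBF base kernel, the full permutation group, projection of the finite Gram matrix), not the paper's exact construction, and all function names are hypothetical.

```python
import numpy as np
from itertools import permutations

def base_kernel(x, y, lengthscale=1.0):
    """Standard RBF kernel between two vectors."""
    d2 = np.sum((x - y) ** 2)
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

def max_kernel(x, y, lengthscale=1.0):
    """Maximum similarity of x to any point in the permutation
    orbit of y. Invariant under the group action, but in general
    NOT positive semi-definite."""
    return max(base_kernel(x, np.array(p), lengthscale)
               for p in permutations(y))

def psd_projection(K):
    """Project a symmetric Gram matrix onto the PSD cone by
    zeroing its negative eigenvalues (the nearest PSD matrix
    in Frobenius norm)."""
    w, V = np.linalg.eigh(K)
    return (V * np.clip(w, 0.0, None)) @ V.T

# Gram matrix of the max kernel on a small random design
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
K = np.array([[max_kernel(X[i], X[j]) for j in range(5)]
              for i in range(5)])

# Projected matrix is symmetric PSD, so it is a valid GP covariance
K_psd = psd_projection(K)
assert np.min(np.linalg.eigvalsh(K_psd)) >= -1e-10
```

In a BO loop, `K_psd` (or the corresponding projected kernel) would serve as the surrogate GP's covariance; the projection is a single eigendecomposition, so it adds no asymptotic cost beyond the usual cubic GP operations.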