Information Theoretic Bayesian Optimization over the Probability Simplex

📅 2026-03-10
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the efficient optimization of black-box, expensive, and potentially noisy objective functions defined over the probability simplex—a non-Euclidean space comprising non-negative vectors that sum to one. The authors propose α-GaBO, a novel method that, for the first time, integrates information geometry into the Bayesian optimization framework. By leveraging α-connections and Riemannian metrics, α-GaBO constructs a Matérn kernel consistent with the intrinsic geometry of the simplex and introduces a one-parameter family of geometry-aware optimizers to efficiently maximize the acquisition function. Extensive experiments on benchmark functions and real-world tasks—including mixture design, classifier ensemble weighting, and robotic control—demonstrate that α-GaBO significantly outperforms existing Euclidean-constrained approaches, confirming its superior modeling accuracy and generalization capability.

Technology Category

Application Category

📝 Abstract
Bayesian optimization is a data-efficient technique that has been shown to be extremely powerful to optimize expensive, black-box, and possibly noisy objective functions. Many applications involve optimizing probabilities and mixtures which naturally belong to the probability simplex, a constrained non-Euclidean domain defined by non-negative entries summing to one. This paper introduces $α$-GaBO, a novel family of Bayesian optimization algorithms over the probability simplex. Our approach is grounded in information geometry, a branch of Riemannian geometry which endows the simplex with a Riemannian metric and a class of connections. Based on information geometry theory, we construct Matérn kernels that reflect the geometry of the probability simplex, as well as a one-parameter family of geometric optimizers for the acquisition function. We validate our method on benchmark functions and on a variety of real-world applications including mixtures of components, mixtures of classifiers, and a robotic control task, showing its increased performance compared to constrained Euclidean approaches.
Problem

Research questions and friction points this paper is trying to address.

Bayesian optimization
probability simplex
information geometry
constrained optimization
black-box optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Information Geometry
Bayesian Optimization
Probability Simplex
Matérn Kernel
Riemannian Metric
🔎 Similar Papers
No similar papers found.