Bayesian Optimization over Bounded Domains with the Beta Product Kernel

📅 2025-06-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
Bayesian optimization (BO) often exhibits weak modeling capability on bounded domains (e.g., the unit hypercube). To address this, we propose a novel non-stationary Beta-product kernel, whose covariance function is constructed as a product of Beta distribution density functions, explicitly incorporating boundary-aware priors into kernel design. Empirical spectral analysis supports an exponential eigenvalue decay for this kernel, suggesting strong approximation capacity of the associated Gaussian process near domain boundaries (e.g., vertices and faces). Evaluation across synthetic benchmark functions and real-world black-box optimization tasks, including Vision Transformer (ViT) and large language model (LLM) compression, demonstrates consistent superiority over standard Matérn and RBF kernels. Notably, in boundary-sensitive scenarios, the method achieves faster convergence and higher accuracy in locating optimal solutions.

📝 Abstract
Bayesian optimization with Gaussian processes (GPs) is commonly used to optimize black-box functions. The Matérn and the Radial Basis Function (RBF) covariance functions are used frequently, but they do not make any assumptions about the domain of the function, which may limit their applicability in bounded domains. To address this limitation, we introduce the Beta kernel, a non-stationary kernel induced by a product of Beta distribution density functions. Such a formulation allows our kernel to naturally model functions on bounded domains. We present statistical evidence supporting the hypothesis that the kernel exhibits an exponential eigendecay rate, based on empirical analyses of its spectral properties across different settings. Our experimental results demonstrate the robustness of the Beta kernel in modeling functions with optima located near the faces or vertices of the unit hypercube. The experiments show that our kernel consistently outperforms a wide range of kernels, including the well-known Matérn and RBF, in different problems, including synthetic function optimization and the compression of vision and language models.
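To make the kernel construction concrete, the following is a minimal, hypothetical sketch of a product-of-Beta-densities covariance on the unit hypercube. It is not the paper's exact kernel: the finite feature map of Beta densities and the `alphas`/`betas` parameter grid are illustrative assumptions, chosen so that the resulting Gram matrix is symmetric positive semi-definite by construction.

```python
import numpy as np
from scipy.stats import beta

def beta_product_kernel(X, Y, alphas=(1.5, 2.0, 3.0, 5.0), betas=(1.5, 2.0, 3.0, 5.0)):
    """Hypothetical Beta-product covariance on [0, 1]^D.

    k(x, y) = prod_d sum_j f_Beta(x_d; a_j, b_j) * f_Beta(y_d; a_j, b_j),
    i.e., a per-dimension inner product of Beta-density features,
    multiplied over dimensions. PSD because it is a finite feature map.
    The (a_j, b_j) grid is an illustrative choice, not the paper's.
    """
    X, Y = np.atleast_2d(X), np.atleast_2d(Y)
    K = np.ones((X.shape[0], Y.shape[0]))
    for d in range(X.shape[1]):
        # Rows: points; columns: Beta densities evaluated at that point.
        FX = np.stack([beta.pdf(X[:, d], a, b) for a, b in zip(alphas, betas)], axis=1)
        FY = np.stack([beta.pdf(Y[:, d], a, b) for a, b in zip(alphas, betas)], axis=1)
        K *= FX @ FY.T  # inner product per dimension, product over dimensions
    return K

rng = np.random.default_rng(0)
X = rng.uniform(0.01, 0.99, size=(30, 2))
K = beta_product_kernel(X, X)
print(np.allclose(K, K.T))  # symmetric Gram matrix
```

Because the covariance is built from Beta densities, it is inherently non-stationary: its magnitude varies across the domain and concentrates mass according to the density parameters, which is what allows boundary-aware behavior that stationary Matérn/RBF kernels cannot express.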
Problem

Research questions and friction points this paper is trying to address.

Optimizing black-box functions whose optima lie near the boundary of a bounded domain
Standard stationary kernels (Matérn, RBF) make no assumptions about domain boundedness
Lack of kernels that encode boundary-aware priors for Gaussian process surrogates
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduces Beta kernel for bounded domains
Statistical evidence for an exponential eigendecay rate from empirical spectral analyses
Outperforms Matérn and RBF kernels consistently
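The eigendecay claim above can be probed with a simple spectral check: eigenvalues of the Gram matrix on N samples, divided by N, approximate the kernel operator's eigenvalues (the Nyström approximation), and an approximately linear log-eigenvalue plot indicates exponential decay. The sketch below is a hedged illustration of that recipe, not the paper's analysis; it uses an RBF kernel as a stand-in since the paper's exact Beta kernel is not reproduced here.

```python
import numpy as np

def rbf_gram(X, lengthscale=0.3):
    """Gram matrix of a squared-exponential kernel (stand-in kernel)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 1))

# Descending Gram eigenvalues / N approximate operator eigenvalues.
lam = np.linalg.eigvalsh(rbf_gram(X))[::-1] / len(X)
top = lam[:15]
top = top[top > 1e-10]  # drop numerical-noise eigenvalues before taking logs

# Least-squares slope of log(lambda_i) vs i: a strongly negative slope
# with a good linear fit is consistent with exponential eigendecay.
i = np.arange(len(top))
slope = np.polyfit(i, np.log(top), 1)[0]
print(f"top eigenvalue {top[0]:.3f}, log-linear slope {slope:.2f}")
```

The same probe applied across sample sizes, dimensions, and kernel parameters is the kind of "empirical analyses of its spectral properties across different settings" the abstract refers to.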
Huy Hoang Nguyen
Research Unit of Health Sciences and Technology, University of Oulu, Oulu, Finland
Han Zhou
Department ESAT, Center for Processing Speech and Images, KU Leuven, Leuven, Belgium
Matthew B. Blaschko
Department ESAT, Center for Processing Speech and Images, KU Leuven, Leuven, Belgium
Aleksei Tiulpin
University of Oulu
Machine Learning in Medicine · Deep Learning · Bayesian Inference