Omnipresent Yet Overlooked: Heat Kernels in Combinatorial Bayesian Optimization

📅 2025-10-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
Bayesian optimization (BO) for combinatorial domains lacks a unified theoretical foundation for kernel design. To address this, we propose a principled framework based on the heat kernel, establishing for the first time a systematic connection between combinatorial kernels and graph heat kernels. We prove that the resulting kernel is insensitive to the location of the optimum and exhibits strong structural invariance. By combining graph-theoretic principles with diffusion-process modeling, we derive compact closed-form heat kernels for diverse combinatorial structures, including sequences, trees, and graphs, and embed them seamlessly into standard BO pipelines. Theoretical analysis shows that several existing combinatorial kernels are special cases of our heat kernel formulation. Experiments on benchmark tasks, including materials discovery and neural architecture search, demonstrate state-of-the-art performance, significantly outperforming more complex and computationally intensive baselines.
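As one concrete instance of the closed-form kernels the summary mentions, the heat kernel on the n-dimensional hypercube (binary sequences compared by Hamming distance) admits a well-known product form. The sketch below uses the classical diffusion-kernel parametrization of Kondor and Lafferty; the paper's exact normalization and lengthscale convention may differ.

```python
import numpy as np

def hypercube_heat_kernel(x, y, t):
    """Normalized heat kernel on the hypercube graph.

    Closed form: k(x, y) = tanh(t) ** d_H(x, y), where d_H is the
    Hamming distance. This is the classical diffusion kernel on
    binary strings; the paper's parametrization is assumed similar.
    """
    d = int(np.sum(np.asarray(x) != np.asarray(y)))  # Hamming distance
    return np.tanh(t) ** d

x = np.array([0, 1, 1, 0])
y = np.array([0, 1, 0, 1])  # differs from x in two positions
k_xy = hypercube_heat_kernel(x, y, t=1.0)
k_xx = hypercube_heat_kernel(x, x, t=1.0)
```

Because the kernel depends on the inputs only through their Hamming distance, it treats every location in the hypercube identically, which illustrates the claimed insensitivity to where the optimum happens to lie.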

📝 Abstract
Bayesian Optimization (BO) has the potential to solve various combinatorial tasks, ranging from materials science to neural architecture search. However, BO requires specialized kernels to effectively model combinatorial domains. Recent efforts have introduced several combinatorial kernels, but the relationships among them are not well understood. To bridge this gap, we develop a unifying framework based on heat kernels, which we derive in a systematic way and express as simple closed-form expressions. Using this framework, we prove that many successful combinatorial kernels are either related or equivalent to heat kernels, and validate this theoretical claim in our experiments. Moreover, our analysis confirms and extends the results presented in Bounce: certain algorithms' performance decreases substantially when the unknown optima of the function do not have a certain structure. In contrast, heat kernels are not sensitive to the location of the optima. Lastly, we show that a fast and simple pipeline, relying on heat kernels, is able to achieve state-of-the-art results, matching or even outperforming certain slow or complex algorithms.
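To make the central object of the abstract concrete: on a finite graph, the heat kernel is the matrix exponential exp(-t L) of the graph Laplacian L. The following minimal sketch (my illustration, not the paper's code) computes it through the eigendecomposition of L.

```python
import numpy as np

def graph_heat_kernel(A, t):
    """Heat kernel K = exp(-t L) for adjacency matrix A.

    L is the combinatorial Laplacian D - A; since L is symmetric,
    the matrix exponential is computed via its eigendecomposition.
    """
    L = np.diag(A.sum(axis=1)) - A
    evals, evecs = np.linalg.eigh(L)
    return evecs @ np.diag(np.exp(-t * evals)) @ evecs.T

# Path graph on three nodes: 0 - 1 - 2
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
K = graph_heat_kernel(A, t=0.5)
```

Similarity decays with graph distance (here K[0, 0] > K[0, 1] > K[0, 2]), and K is symmetric positive definite, so it can be plugged into a standard Gaussian-process pipeline as a Gram matrix.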
Problem

Research questions and friction points this paper is trying to address.

Combinatorial kernels for Bayesian optimization lack a unified theoretical foundation
The relationships among existing combinatorial kernels are not well understood
Some existing algorithms are sensitive to the structure of the function's unknown optima
Innovation

Methods, ideas, or system contributions that make the work stand out.

Developed a unifying framework based on heat kernels, derived systematically as simple closed-form expressions
Proved that many successful combinatorial kernels are related or equivalent to heat kernels
Achieved state-of-the-art results with a fast and simple heat-kernel pipeline
Colin Doumont
ETH Zürich, University of Cambridge, Tübingen AI Center

Victor Picheny
Secondmind

Viacheslav Borovitskiy
Assistant Professor (Lecturer) at the University of Edinburgh
Machine Learning · Uncertainty Quantification · Geometric Learning · Harmonic Analysis

Henry Moss
University of Cambridge, Lancaster University