HiBBO: HiPPO-based Space Consistency for High-dimensional Bayesian Optimisation

📅 2025-10-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
High-dimensional Bayesian optimisation (BO) suffers from data sparsity and poor surrogate-model scalability. Existing variational autoencoder (VAE)-based approaches mitigate this via dimensionality reduction, but their reconstruction-only objective induces a distributional mismatch between the latent and original spaces, degrading optimisation performance. This paper proposes HiBBO, a BO framework that integrates HiPPO (high-order polynomial projection operators), a mechanism for long-term sequence modelling, into the VAE architecture. HiPPO is used to enforce consistency of function responses in the latent space, improving distributional fidelity. By preserving both spatial and statistical properties of high-dimensional functions within a low-dimensional latent representation, HiBBO accelerates convergence and improves solution quality. Experiments show that HiBBO outperforms state-of-the-art VAE-BO methods on multiple high-dimensional benchmarks and generalises to real-world applications, including neural architecture search and materials science.

📝 Abstract
Bayesian Optimisation (BO) is a powerful tool for optimising expensive black-box functions, but its effectiveness diminishes in high-dimensional spaces due to sparse data and poor surrogate model scalability. While Variational Autoencoder (VAE)-based approaches address this by learning low-dimensional latent representations, the reconstruction-based objective function often brings a functional distribution mismatch between the latent space and the original space, leading to suboptimal optimisation performance. In this paper, we first analyse why a reconstruction-only loss may lead to distribution mismatch, and then propose HiBBO, a novel BO framework that introduces space consistency into the latent space construction in VAE using HiPPO, a method for long-term sequence modelling, to reduce the functional distribution mismatch between the latent space and the original space. Experiments on high-dimensional benchmark tasks demonstrate that HiBBO outperforms existing VAE-BO methods in convergence speed and solution quality. Our work bridges the gap between high-dimensional sequence representation learning and efficient Bayesian Optimisation, enabling broader applications in neural architecture search, materials science, and beyond.
Problem

Research questions and friction points this paper is trying to address.

Bayesian Optimisation struggles in high-dimensional spaces due to sparse data
Reconstruction-based VAEs cause functional distribution mismatch in latent spaces
Existing methods show suboptimal performance in convergence and solution quality
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses HiPPO for latent space consistency
Reduces distribution mismatch in VAE-BO
Improves high-dimensional Bayesian Optimisation in convergence speed and solution quality
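The "space consistency" idea in these bullets, aligning the geometry of the latent space with that of the original space, can be illustrated with a toy penalty. The sketch below is a hypothetical illustration in plain Python (the function name `space_consistency_penalty` and the normalised pairwise-distance formulation are assumptions for exposition, not the paper's HiPPO-based loss):

```python
def sq_dist(a, b):
    # squared Euclidean distance between two points given as sequences
    return sum((x - y) ** 2 for x, y in zip(a, b))

def space_consistency_penalty(X, Z):
    """Toy mismatch measure between an original space X and a latent space Z.

    Compares max-normalised pairwise squared distances in both spaces;
    0.0 means the two spaces have identical (scaled) pairwise geometry.
    Hypothetical sketch, not the HiBBO/HiPPO objective itself.
    """
    n = len(X)
    dx = [[sq_dist(X[i], X[j]) for j in range(n)] for i in range(n)]
    dz = [[sq_dist(Z[i], Z[j]) for j in range(n)] for i in range(n)]
    mx = max(max(row) for row in dx) or 1.0  # avoid division by zero
    mz = max(max(row) for row in dz) or 1.0
    total = sum((dx[i][j] / mx - dz[i][j] / mz) ** 2
                for i in range(n) for j in range(n))
    return total / (n * n)
```

A reconstruction-only VAE loss can drive such a penalty high even when reconstructions are accurate; a consistency term of this flavour would be minimised alongside reconstruction so that distances (and hence function behaviour) in the latent space mirror those in the original space.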