Nonlinear Dimensionality Reduction Techniques for Bayesian Optimization

📅 2025-10-17
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the poor scalability of Bayesian optimization (BO) in high-dimensional spaces, this paper proposes a Variational Autoencoder (VAE)-driven Latent-Space BO (LSBO) framework. Methodologically, it uses a VAE to learn a structured, low-dimensional latent manifold underlying the high-dimensional search space and integrates a sequential domain reduction strategy that progressively narrows the search region. Deep metric learning is introduced to improve representation consistency, while Matérn-5/2 Gaussian process surrogates, implemented in a GPU-accelerated BoTorch stack, enable efficient, adaptive optimization in the latent space. The key innovation is the first coupling of sequential domain reduction with VAE-based manifold learning, complemented by a dynamic encoder-retraining mechanism. Extensive experiments on high-dimensional benchmark functions show that LSBO significantly outperforms baselines such as random projection, confirming that structured latent manifolds substantially improve both optimization efficiency and convergence behavior.
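The latent-space optimization loop the summary describes can be sketched in a few lines. This is a minimal toy, not the paper's implementation: a fixed random linear map stands in for the trained VAE decoder, a shifted sphere function stands in for the expensive black-box objective, and random candidate sampling stands in for the GP-surrogate acquisition maximization done with BoTorch; all names and dimensions here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

D, d = 50, 4                        # ambient and latent dimensions (illustrative)
x_star = rng.uniform(-1.0, 1.0, D)  # unknown optimizer of the toy objective

def f(x):
    # Stand-in for the expensive black-box objective: shifted sphere.
    return float(np.sum((x - x_star) ** 2))

# Stand-in decoder: a fixed linear map (the paper trains a VAE decoder).
A = rng.standard_normal((D, d)) / np.sqrt(d)
decode = lambda z: np.clip(A @ z, -1.0, 1.0)

# Latent-space optimization loop; random sampling replaces the GP
# surrogate + acquisition maximization used in the actual method.
best_z, best_y = None, np.inf
for _ in range(200):
    z = rng.uniform(-2.0, 2.0, d)   # propose a point in the latent box
    y = f(decode(z))                # decode, then evaluate in ambient space
    if y < best_y:
        best_z, best_y = z, y
```

The key structural point survives the simplifications: the search happens entirely in the d-dimensional latent box, and the objective is only ever queried through the decoder.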

📝 Abstract
Bayesian optimisation (BO) is a standard approach for sample-efficient global optimisation of expensive black-box functions, yet its scalability to high dimensions remains challenging. Here, we investigate nonlinear dimensionality reduction techniques that reduce the problem to a sequence of low-dimensional Latent-Space BO (LSBO) problems. While early LSBO methods used (linear) random projections (Wang et al., 2013), we build on Grosnit et al. (2021) and employ Variational Autoencoders (VAEs) for LSBO, focusing on a deep metric loss for structured latent manifolds and on VAE retraining to adapt the encoder-decoder to newly sampled regions. We propose some changes to their implementation, originally designed for tasks such as molecule generation, and reformulate the algorithm for broader optimisation purposes. We then couple LSBO with Sequential Domain Reduction (SDR) directly in the latent space (SDR-LSBO), yielding an algorithm that narrows the latent search domains as evidence accumulates. Implemented in a GPU-accelerated BoTorch stack with Matérn-5/2 Gaussian process surrogates, our numerical results show improved optimisation quality across benchmark tasks and that structured latent manifolds improve BO performance. Additionally, we compare random embeddings and VAEs as two mechanisms for dimensionality reduction, showing that the latter outperforms the former. To the best of our knowledge, this is the first study to combine SDR with VAE-based LSBO, and our analysis clarifies design choices for metric shaping and retraining that are critical for scalable latent-space BO. For reproducibility, our source code is available at https://github.com/L-Lok/Nonlinear-Dimensionality-Reduction-Techniques-for-Bayesian-Optimization.git.
Problem

Research questions and friction points this paper is trying to address.

Scalable Bayesian optimization in high dimensions using nonlinear dimensionality reduction
Improving optimization quality via structured latent manifolds and VAE retraining
Combining sequential domain reduction with latent space Bayesian optimization
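The last point, sequential domain reduction in the latent space, amounts to contracting the latent search box around the incumbent best point as evidence accumulates. A minimal sketch follows; the contraction factor `gamma` and the single-incumbent shrink rule are assumptions for illustration, not the paper's exact SDR schedule.

```python
import numpy as np

def shrink_bounds(lb, ub, incumbent, gamma=0.7):
    # Contract each side of the box toward the incumbent by factor gamma,
    # clipped so the new box never grows past the old one.
    half = gamma * (ub - lb) / 2.0
    new_lb = np.maximum(lb, incumbent - half)
    new_ub = np.minimum(ub, incumbent + half)
    return new_lb, new_ub

# Latent box of dimension 4 and a hypothetical incumbent best point.
lb, ub = np.full(4, -2.0), np.full(4, 2.0)
incumbent = np.array([0.5, -0.3, 1.0, 0.0])
for _ in range(3):
    lb, ub = shrink_bounds(lb, ub, incumbent)
```

After each shrink the incumbent remains inside the box while the box volume decreases, which is what concentrates later BO iterations on the most promising latent region.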
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Variational Autoencoders for latent space dimensionality reduction
Combines Sequential Domain Reduction in latent space optimization
Implements structured latent manifolds with deep metric loss
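The "deep metric loss" in the last bullet can be illustrated with a standard triplet margin loss, which pulls latent codes of similar points together and pushes dissimilar ones apart. This NumPy sketch shows the loss itself only; how the paper forms triplets and weights this term during VAE training is not specified here, and the example points are made up.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Standard triplet margin loss on latent codes:
    # penalize when the positive is not closer than the negative by `margin`.
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

a = np.array([0.0, 0.0])   # anchor latent code
p = np.array([0.1, 0.0])   # similar point: already close to the anchor
n = np.array([3.0, 0.0])   # dissimilar point: far from the anchor
loss = triplet_loss(a, p, n)
```

Here the triplet is already well-ordered (d_pos + margin < d_neg), so the loss is zero; swapping the roles of `p` and `n` produces a positive loss that would push the encoder to restructure the latent space.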
Luo Long
Mathematical Institute, University of Oxford, Radcliffe Observatory, Andrew Wiles Building, Woodstock Rd, Oxford, OX2 6GG, United Kingdom
Coralia Cartis
University of Oxford
Optimization, Numerical Analysis, Complexity, Compressed Sensing
Paz Fink Shustin
Mathematical Institute, University of Oxford, Radcliffe Observatory, Andrew Wiles Building, Woodstock Rd, Oxford, OX2 6GG, United Kingdom