🤖 AI Summary
Bayesian optimization (BO) of expensive black-box functions over complex input spaces, such as discrete or non-Euclidean domains, remains challenging. Existing generative model-based optimization (GMO) methods map the input space into a single latent space whose dimension is hard to determine, creating a conflict between solution accuracy and convergence rate.
Method: We propose a multiform generative optimization framework (GMFoO) featuring: (i) parallel BO over multiple cooperative latent spaces; (ii) a generative model (VAE or GAN) trained with a positive-correlation constraint between latent spaces to enable effective knowledge transfer; and (iii) two cross-space information-exchange strategies that let low-dimensional spaces contribute fast convergence while higher-dimensional spaces preserve solution accuracy.
Results: Evaluated on airfoil design, corbel design, and an area maximization problem, GMFoO converges to better designs than both single-latent-space GMO and conventional BO under limited evaluation budgets.
📝 Abstract
Many real-world problems, such as airfoil design, involve optimizing an expensive black-box objective function over a complex-structured input space (e.g., a discrete or non-Euclidean space). By mapping the complex-structured input space into a latent space of dozens of variables, a two-stage procedure, referred to in this article as generative model-based optimization (GMO), shows promise in solving such problems. However, the latent dimension of GMO is hard to determine, which may create a conflict between the desired solution accuracy and the convergence rate. To address this issue, we propose a multiform GMO approach, namely generative multiform optimization (GMFoO), which conducts optimization over multiple latent spaces simultaneously so that they complement each other. More specifically, we devise a generative model that promotes a positive correlation between latent spaces to facilitate effective knowledge transfer in GMFoO. Furthermore, using Bayesian optimization (BO) as the optimizer, we propose two strategies for continuously exchanging information between these latent spaces. Experimental results on airfoil and corbel design problems, as well as an area maximization problem, demonstrate that the proposed GMFoO converges to better designs on a limited computational budget.
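The multiform loop described above can be illustrated with a minimal, heavily simplified sketch. Everything here is a stand-in: `objective` replaces an expensive simulator, the random linear `decode` functions replace the learned generative models, random sampling replaces a real BO acquisition step, and the pad/truncate exchange replaces the paper's learned cross-space mapping. The names (`optimize_multiform`, `make_decoder`) are hypothetical, not from the paper.

```python
import random
import math

DESIGN_DIM = 6  # fixed dimension of the decoded design vector

def objective(design):
    # Toy stand-in for an expensive black-box simulator (e.g., an airfoil solver).
    return -sum((x - 0.3) ** 2 for x in design)

def make_decoder(latent_dim, seed):
    # Hypothetical linear "decoder" from a latent space to the design space;
    # in GMFoO this would be a trained generative model (VAE/GAN).
    rng = random.Random(seed)
    w = [[rng.uniform(-1, 1) for _ in range(latent_dim)]
         for _ in range(DESIGN_DIM)]
    def decode(z):
        return [sum(wij * zj for wij, zj in zip(row, z)) for row in w]
    return decode

def optimize_multiform(latent_dims=(2, 4), rounds=30, seed=0):
    rng = random.Random(seed)
    decoders = [make_decoder(d, seed + i) for i, d in enumerate(latent_dims)]
    best = [(None, -math.inf)] * len(latent_dims)  # (latent, value) per space
    for _ in range(rounds):
        # Step 1: one optimization step per latent space. A real BO loop would
        # fit a GP surrogate and maximize an acquisition function here; random
        # sampling is only a placeholder.
        for i, (d, decode) in enumerate(zip(latent_dims, decoders)):
            z = [rng.gauss(0.0, 1.0) for _ in range(d)]
            f = objective(decode(z))
            if f > best[i][1]:
                best[i] = (z, f)
        # Step 2: cross-space information exchange. Each space re-evaluates the
        # other spaces' incumbents, mapped over by crude padding/truncation (a
        # stand-in for the learned, positively correlated cross-space mapping).
        for i in range(len(latent_dims)):
            for j in range(len(latent_dims)):
                if j == i or best[j][0] is None:
                    continue
                z = (best[j][0] + [0.0] * latent_dims[i])[:latent_dims[i]]
                f = objective(decoders[i](z))
                if f > best[i][1]:
                    best[i] = (z, f)
    return max(b[1] for b in best)

print(optimize_multiform())
```

The point of the sketch is the structure, not the numbers: each round interleaves per-space optimization with an exchange step, so a space whose dimension converges slowly can still benefit from incumbents found in a faster-converging space.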