Posterior Inference in Latent Space for Scalable Constrained Black-box Optimization

📅 2025-07-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
High-dimensional black-box constrained optimization faces challenges including intractable feasible region identification, the curse of dimensionality in Bayesian optimization (BO), and poor scalability and mode collapse in generative modeling. Method: We propose a novel framework integrating generative modeling with BO, where candidate solution sampling is formulated as posterior inference in a latent space. Leveraging normalizing flow-based generative models, our approach enables constraint-aware distribution learning and uncertainty quantification, while amortized inference ensures efficient sampling. Contribution/Results: The method effectively mitigates optimization difficulties arising from multimodality and hard constraints, avoids mode collapse, and significantly improves scalability to high dimensions. Extensive experiments on synthetic benchmarks and real-world tasks demonstrate superior convergence speed and robustness compared to state-of-the-art methods.

📝 Abstract
Optimizing high-dimensional black-box functions under black-box constraints is a pervasive task in a wide range of scientific and engineering problems. These problems are typically harder than unconstrained problems due to hard-to-find feasible regions. While Bayesian optimization (BO) methods have been developed to solve such problems, they often struggle with the curse of dimensionality. Recently, generative model-based approaches have emerged as a promising alternative for constrained optimization. However, they suffer from poor scalability and are vulnerable to mode collapse, particularly when the target distribution is highly multi-modal. In this paper, we propose a new framework to overcome these challenges. Our method iterates through two stages. First, we train flow-based models to capture the data distribution and surrogate models that predict both function values and constraint violations with uncertainty quantification. Second, we cast the candidate selection problem as a posterior inference problem to effectively search for promising candidates that have high objective values while not violating the constraints. During posterior inference, we find that the posterior distribution is highly multi-modal and has a large plateau due to constraints, especially when constraint feedback is given as binary indicators of feasibility. To mitigate this issue, we amortize the sampling from the posterior distribution in the latent space of flow-based models, which is much smoother than that in the data space. We empirically demonstrate that our method achieves superior performance on various synthetic and real-world constrained black-box optimization tasks. Our code is publicly available at https://github.com/umkiyoung/CiBO.
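The two-stage loop in the abstract (fit a flow and surrogates, then select candidates by posterior inference) can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: a simple affine shift/scale map stands in for the normalizing flow, nearest-neighbour predictors stand in for the surrogates, importance resampling stands in for the paper's amortized sampler, and the objective, constraint, and hyperparameters are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 2

# Toy black-box objective (maximize) and constraint (feasible iff <= 0).
def objective(x):
    return -np.sum((x - 1.0) ** 2, axis=-1)

def constraint(x):
    return np.sum(x ** 2, axis=-1) - 4.0

# Initial design of evaluated points.
X = rng.uniform(-3, 3, size=(64, DIM))

for _ in range(5):
    # Stage 1a: "flow" fit -- an affine invertible map as a stand-in
    # for a normalizing flow: x = mu + sigma * z, z ~ N(0, I).
    mu, sigma = X.mean(axis=0), X.std(axis=0) + 1e-8

    # Stage 1b: cheap surrogates (nearest neighbour) for the objective
    # value and the binary feasibility indicator.
    y = objective(X)
    feas = (constraint(X) <= 0).astype(float)

    def surrogate(q, targets):
        d = np.linalg.norm(q[:, None, :] - X[None, :, :], axis=-1)
        return targets[np.argmin(d, axis=1)]

    # Stage 2: posterior inference by sampling in latent space and
    # reweighting by exp(beta * f_hat) times predicted feasibility.
    z = rng.standard_normal((2048, DIM))
    cand = mu + sigma * z
    w = np.exp(2.0 * surrogate(cand, y)) * np.clip(surrogate(cand, feas), 1e-3, 1.0)
    idx = rng.choice(len(cand), size=16, p=w / w.sum())
    X = np.vstack([X, cand[idx]])  # "evaluate" and append new candidates

feasible = X[constraint(X) <= 0]
best_val = objective(feasible).max()
print(best_val)
```

The resampled candidates concentrate in the feasible, high-objective region, so the best feasible value improves (or at worst stays the same, since evaluated points are only ever appended) across rounds.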
Problem

Research questions and friction points this paper is trying to address.

Optimizing high-dimensional black-box functions under black-box constraints
Overcoming poor scalability and mode collapse in generative model-based optimization
Handling highly multi-modal posterior distributions in constrained optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Flow-based generative models that capture the data distribution
Candidate selection cast as posterior inference in latent space
Amortized sampling to handle the multi-modal posterior
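The latent-space posterior idea above can be illustrated with a short sketch: define a log-posterior over the latent variable z that combines the flow prior N(0, I), a surrogate objective, and a soft feasibility probability, then sample it with random-walk Metropolis. Everything here is a hypothetical stand-in: an affine map plays the role of the flow's decoder, the quadratic surrogate and sigmoid feasibility model are invented for illustration, and the paper amortizes sampling with a learned sampler rather than running MCMC.

```python
import numpy as np

rng = np.random.default_rng(1)

# Affine stand-in for the flow decoder g: latent z -> data x.
mu, sigma = np.zeros(2), np.ones(2)
g = lambda z: mu + sigma * z

f_hat = lambda x: -np.sum((x - 1.0) ** 2)                     # surrogate objective
p_feas = lambda x: 1.0 / (1.0 + np.exp(np.sum(x ** 2) - 4.0))  # soft feasibility

beta = 3.0  # inverse temperature on the objective

def log_post(z):
    # log N(z; 0, I) + beta * f_hat(g(z)) + log feasibility probability;
    # composing with g keeps this landscape smooth in z.
    x = g(z)
    return -0.5 * z @ z + beta * f_hat(x) + np.log(p_feas(x) + 1e-12)

# Random-walk Metropolis in the latent space.
z = np.zeros(2)
samples = []
for _ in range(3000):
    prop = z + 0.3 * rng.standard_normal(2)
    if np.log(rng.random()) < log_post(prop) - log_post(z):
        z = prop
    samples.append(z.copy())

xs = g(np.array(samples[1000:]))  # map post-burn-in samples back to data space
post_mean = xs.mean(axis=0)
print(post_mean)
```

Because the binary feasibility indicator is replaced by a smooth probability and the chain moves in the latent space, the sampler avoids the flat plateaus that binary constraint feedback induces in data space; the samples concentrate in the feasible, high-objective region.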