Posterior Inference with Diffusion Models for High-dimensional Black-box Optimization

📅 2025-02-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the poor scalability of Bayesian optimization (BO) and weak uncertainty quantification in generative approaches for high-dimensional (≥100D) black-box function optimization, this paper proposes DiBO—a novel framework that introduces diffusion models to black-box optimization for the first time. DiBO formulates optimization as a posterior inference task and enables efficient amortized inference via fine-tuning. It jointly integrates ensemble surrogate models for robust uncertainty estimation and iteratively refines candidate solutions in a generative manner. DiBO overcomes fundamental dimensionality and evaluation-budget limitations of conventional BO, achieving significant improvements over state-of-the-art methods across diverse synthetic and real-world black-box benchmarks—particularly under kilo-scale (≈1,000) function evaluations. The implementation is publicly available.

📝 Abstract
Optimizing high-dimensional and complex black-box functions is crucial in numerous scientific applications. While Bayesian optimization (BO) is a powerful method for sample-efficient optimization, it struggles with the curse of dimensionality and with scaling to thousands of evaluations. Recently, leveraging generative models to solve black-box optimization problems has emerged as a promising framework. However, these methods often underperform compared to BO due to limited expressivity and the difficulty of uncertainty estimation in high-dimensional spaces. To overcome these issues, we introduce DiBO, a novel framework for solving high-dimensional black-box optimization problems. Our method iterates between two stages. First, we train a diffusion model to capture the data distribution and an ensemble of proxies to predict function values with uncertainty quantification. Second, we cast candidate selection as a posterior inference problem to balance exploration and exploitation in high-dimensional spaces. Concretely, we fine-tune diffusion models to amortize posterior inference. Extensive experiments demonstrate that our method outperforms state-of-the-art baselines across various synthetic and real-world black-box optimization tasks. Our code is publicly available at https://github.com/umkiyoung/DiBO.
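The two-stage loop in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the diffusion model is replaced by a simple perturbation sampler, the proxy ensemble by ridge regressors on random Fourier features, and the fine-tuned posterior sampler by self-normalized importance reweighting with weights proportional to exp(λ·(mean + β·std)). The function names (`fit_proxy_ensemble`, `select_candidates`) and the toy objective are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Toy black-box function: negative squared norm, maximized at the origin.
    return -np.sum(x**2, axis=-1)

def fit_proxy_ensemble(X, y, n_models=5, n_feat=64):
    """Stage 1 (proxy part): fit an ensemble of ridge regressors on random
    Fourier features; disagreement across members stands in for uncertainty."""
    models = []
    d = X.shape[1]
    for _ in range(n_models):
        W = rng.normal(size=(d, n_feat))
        b = rng.uniform(0, 2 * np.pi, n_feat)
        Phi = np.cos(X @ W + b)
        w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(n_feat), Phi.T @ y)
        models.append((W, b, w))
    return models

def predict(models, X):
    # Ensemble mean as the value estimate, member std as the uncertainty.
    preds = np.stack([np.cos(X @ W + b) @ w for W, b, w in models])
    return preds.mean(0), preds.std(0)

def select_candidates(models, sampler, n_proposals=512, n_select=16,
                      beta=1.0, lam=1.0):
    """Stage 2: draw proposals from the generative sampler and resample them
    with weights exp(lam * (mean + beta*std)), balancing exploitation (mean)
    and exploration (std) -- a crude stand-in for posterior inference."""
    X = sampler(n_proposals)
    mu, sigma = predict(models, X)
    logits = lam * (mu + beta * sigma)
    p = np.exp(logits - logits.max())
    p /= p.sum()
    idx = rng.choice(n_proposals, size=n_select, replace=False, p=p)
    return X[idx]

# Outer loop: alternate model fitting and candidate selection/evaluation.
d = 8
X = rng.uniform(-3, 3, size=(64, d))
y = objective(X)
for _ in range(5):
    models = fit_proxy_ensemble(X, y)
    # Perturbation sampler around observed data, in place of a diffusion model.
    sampler = lambda n, X=X: X[rng.choice(len(X), n)] + 0.3 * rng.normal(size=(n, d))
    cand = select_candidates(models, sampler)
    X = np.vstack([X, cand])
    y = np.concatenate([y, objective(cand)])
```

The reweighting step mirrors the exploration-exploitation trade-off the abstract describes: raising `beta` favors uncertain regions, while raising `lam` sharpens the posterior toward the proxies' current best guesses.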
Problem

Research questions and friction points this paper is trying to address.

High-dimensional black-box optimization
Curse of dimensionality
Uncertainty estimation in optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Diffusion models for optimization
Ensemble proxies for uncertainty
Posterior inference balancing exploration
🔎 Similar Papers
2024-02-28 · International Conference on Artificial Intelligence and Statistics · Citations: 24