Robust Guided Diffusion for Offline Black-Box Optimization

📅 2024-10-01
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
We address offline black-box optimization—maximizing an unknown objective function using only a static dataset of design-performance pairs. We propose a two-module framework: surrogate-augmented sampling and diffusion-driven surrogate refinement. To our knowledge, this is the first method to synergistically integrate classifier-free diffusion models with surrogate-guided closed-loop optimization. First, gradient-guided sampling leverages regression surrogates (e.g., MLP or GBDT) to provide explicit gradient directions for diffusion-based design generation. Second, the surrogate is dynamically retrained using diffusion-generated samples as prior knowledge, thereby correcting its out-of-distribution prediction bias. Our approach achieves state-of-the-art performance across all benchmarks in Design-Bench, significantly improving both high-value sample generation rates and generalization to unseen design regions. The implementation is publicly available.
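The closed-loop retraining described above can be sketched as follows. This is an illustrative outline only, not the authors' implementation: the helper names (`train_proxy`, `generate`, `refine_proxy`) and the pseudo-labelling of generated samples with the conditioning target value are assumptions made for the sketch.

```python
import numpy as np

def refine_proxy(train_proxy, generate, xs, ys, target_y, rounds=3):
    """Sketch of diffusion-based proxy refinement (all names illustrative):
    a conditional diffusion model generates designs for a target value, and
    those samples, pseudo-labelled with that target, are mixed into the
    proxy's training data so the proxy also sees regions outside the
    original offline dataset, reducing its out-of-distribution bias."""
    proxy = train_proxy(xs, ys)
    for _ in range(rounds):
        x_gen = generate(target_y)              # conditionally generated designs
        y_gen = np.full(len(x_gen), target_y)   # pseudo-labels from the condition
        xs = np.concatenate([xs, x_gen])        # augment the offline dataset
        ys = np.concatenate([ys, y_gen])
        proxy = train_proxy(xs, ys)             # retrain on the augmented mix
    return proxy
```

In practice `train_proxy` would fit the regression surrogate (e.g. an MLP or GBDT, as the summary mentions) and `generate` would run the diffusion sampler; both are left abstract here.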

📝 Abstract
Offline black-box optimization aims to maximize a black-box function using an offline dataset of designs and their measured properties. Two main approaches have emerged: the forward approach, which learns a mapping from input to its value, thereby acting as a proxy to guide optimization, and the inverse approach, which learns a mapping from value to input for conditional generation. (a) Although proxy-free (classifier-free) diffusion shows promise in robustly modeling the inverse mapping, it lacks explicit guidance from proxies, essential for generating high-performance samples beyond the training distribution. Therefore, we propose proxy-enhanced sampling, which utilizes the explicit guidance from a trained proxy to bolster proxy-free diffusion with enhanced sampling control. (b) Yet, the trained proxy is susceptible to out-of-distribution issues. To address this, we devise the module diffusion-based proxy refinement, which seamlessly integrates insights from proxy-free diffusion back into the proxy for refinement. To sum up, we propose Robust Guided Diffusion for Offline Black-box Optimization (RGD), combining the advantages of the proxy (explicit guidance) and proxy-free diffusion (robustness) for effective conditional generation. RGD achieves state-of-the-art results on various Design-Bench tasks, underscoring its efficacy. Our code is at https://github.com/GGchen1997/RGD.
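Proxy-enhanced sampling, as described in the abstract, interleaves proxy-free reverse-diffusion steps with explicit gradient guidance from the trained proxy. The following is a minimal sketch under assumed interfaces (the function names, the additive guidance form, and the `guidance_scale` parameter are illustrative, not the paper's exact update rule):

```python
import numpy as np

def proxy_enhanced_sample(denoise_step, proxy_grad, x, num_steps,
                          guidance_scale=1.0):
    """Sketch of proxy-enhanced sampling (names illustrative): each
    proxy-free reverse-diffusion step is followed by a gradient-ascent
    nudge from a trained proxy, steering generation toward designs the
    proxy predicts to be high-value."""
    for t in reversed(range(num_steps)):
        x = denoise_step(x, t)                   # proxy-free reverse step
        x = x + guidance_scale * proxy_grad(x)   # explicit proxy guidance
    return x
```

Here `denoise_step` stands in for one step of the classifier-free diffusion sampler and `proxy_grad` for the gradient of the proxy's predicted value with respect to the design (e.g. via autograd for an MLP proxy).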
Problem

Research questions and friction points this paper is trying to address.

Offline Black-box Optimization
Surrogate Model Guidance
Unknown Design Handling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Robust Guided Diffusion
Offline Black-box Optimization
Enhanced Sampling and Refinement
Can Chen
McGill University, MILA - Quebec AI Institute
Christopher Beckham
MILA - Quebec AI Institute, Polytechnique Montreal
Zixuan Liu
University of Washington
Xue Liu
McGill University, MILA - Quebec AI Institute
C. Pal
MILA - Quebec AI Institute, Polytechnique Montreal, Canada CIFAR AI Chair