Efficient optimization of expensive black-box simulators via marginal means, with application to neutrino detector design

📅 2025-08-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Optimizing high-dimensional, expensive black-box functions—e.g., simulations requiring hundreds of CPU hours per evaluation—remains challenging; existing "select-best" methods perform poorly under limited function evaluations. This paper proposes a novel optimization framework, Black-box Optimization via Marginal Means (BOMM), which models the objective as a generalized additive model with an unknown link function and implements it via a transformed additive Gaussian process surrogate. The unknown link function enables consistent estimation of the global optimum. Theoretically, the method tempers the curse of dimensionality and achieves superior convergence rates. Numerical experiments and an application to neutrino detector design demonstrate that the proposed approach significantly outperforms state-of-the-art black-box optimization algorithms in high-dimensional settings, yielding superior designs with substantially fewer simulation evaluations.

📝 Abstract
With advances in scientific computing, computer experiments are increasingly used for optimizing complex systems. However, for modern applications, e.g., the optimization of nuclear physics detectors, each experiment run can require hundreds of CPU hours, making the optimization of its black-box simulator over a high-dimensional space a challenging task. Given limited runs at inputs $\mathbf{x}_1, \cdots, \mathbf{x}_n$, the best solution from these evaluated inputs can be far from optimal, particularly as dimensionality increases. Existing black-box methods, however, largely employ this "pick-the-winner" (PW) solution, which leads to mediocre optimization performance. To address this, we propose a new Black-box Optimization via Marginal Means (BOMM) approach. The key idea is a new estimator of a global optimizer $\mathbf{x}^*$ that leverages the so-called marginal mean functions, which can be efficiently inferred with limited runs in high dimensions. Unlike PW, this estimator can select solutions beyond evaluated inputs for improved optimization performance. Assuming the objective function follows a generalized additive model with unknown link function and under mild conditions, we prove that the BOMM estimator not only is consistent for optimization, but also has an optimization rate that tempers the "curse-of-dimensionality" faced by existing methods, thus enabling better performance as dimensionality increases. We present a practical framework for implementing BOMM using the transformed additive Gaussian process surrogate model. Finally, we demonstrate the effectiveness of BOMM in numerical experiments and an application on neutrino detector optimization in nuclear physics.
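The contrast between the pick-the-winner solution and a marginal-means estimate can be illustrated on a toy problem. The sketch below is not the paper's method: it replaces the transformed additive Gaussian process surrogate with crude per-coordinate bin averages of the observed outputs, purely to show how a coordinate-wise minimizer of estimated marginal means can land at an input that was never evaluated. The toy objective, bin count, and budget are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy additive objective standing in for an expensive simulator;
# the global minimum is at x_j = 0.3 in every coordinate.
def f(X):
    return np.sum((X - 0.3) ** 2, axis=1)

d, n = 6, 200                       # dimension, evaluation budget
X = rng.uniform(0.0, 1.0, (n, d))   # a space-filling design would be used in practice
y = f(X)

# Pick-the-winner (PW): the best input among those actually evaluated.
x_pw = X[np.argmin(y)]

# Marginal-means estimate: for each coordinate j, average y within bins
# of x_j (a crude stand-in for GP-based marginal mean inference), then
# take the per-coordinate minimizer. The result need not be an evaluated input.
bins = np.linspace(0.0, 1.0, 7)
centers = 0.5 * (bins[:-1] + bins[1:])
x_mm = np.empty(d)
for j in range(d):
    idx = np.clip(np.digitize(X[:, j], bins) - 1, 0, len(centers) - 1)
    means = np.array([y[idx == b].mean() if np.any(idx == b) else np.inf
                      for b in range(len(centers))])
    x_mm[j] = centers[np.argmin(means)]

print("PW value:", f(x_pw[None])[0])
print("Marginal-means value:", f(x_mm[None])[0])
```

Because the estimated marginal means are each inferred from all $n$ runs, the per-coordinate estimates remain usable as $d$ grows, whereas the best evaluated point degrades quickly; the paper's GP-based construction makes this precise.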
Problem

Research questions and friction points this paper is trying to address.

Optimizing expensive black-box simulators efficiently
Addressing high-dimensional optimization challenges
Improving performance in neutrino detector design
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses marginal mean functions for optimization
Transformed additive Gaussian process surrogate model
Consistent optimization rate in high dimensions
Hwanwoo Kim
Duke University
Simon Mak
Assistant Professor of Statistical Science, Duke University
Bayesian Statistics · Machine Learning · Scientific Computing · Uncertainty Quantification
Ann-Kathrin Schuetz
Nuclear Science Division, Lawrence Berkeley National Laboratory
Alan Poon
Nuclear Science Division, Lawrence Berkeley National Laboratory