🤖 AI Summary
This work addresses global optimization of noisy black-box functions, in particular the tendency of optimizers to become trapped in local optima on multimodal landscapes. The authors propose a novel neural network that integrates multiple input modalities (function values, derivatives, and spline coefficients) within an iterative position-update mechanism, achieving robust global optimization without requiring gradient information or multiple random restarts. Trained end-to-end, the model progressively refines its estimate of the true global minimum by leveraging noisy observations and their spline approximations. Empirical evaluations show a mean error of 8.05% on multimodal benchmark functions (versus 36.24% for the spline-based initialization, a 28.18 percentage-point reduction), with 72% of test cases achieving errors below 10%.
📝 Abstract
Global optimization of black-box functions from noisy samples is a fundamental challenge in machine learning and scientific computing. Traditional methods such as Bayesian optimization often converge to local minima on multi-modal functions, while gradient-free methods require many function evaluations. We present a novel neural approach that learns to find global minima through iterative refinement. Our model takes noisy function samples and their fitted spline representation as input, then iteratively refines an initial guess toward the true global minimum. Trained on randomly generated functions with ground-truth global minima obtained via exhaustive search, our method achieves a mean error of 8.05 percent on challenging multi-modal test functions, compared to 36.24 percent for the spline initialization, a 28.18 percentage-point reduction. The model finds global minima with error below 10 percent in 72 percent of test cases, demonstrating learned optimization principles rather than mere curve fitting. Our architecture combines an encoding of multiple modalities (function values, derivatives, and spline coefficients) with iterative position updates, enabling robust global optimization without requiring derivative information or multiple restarts.
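To make the setup concrete, the following is a minimal sketch (not the paper's implementation) of the pipeline the abstract describes: sample a multimodal function with noise, fit a cubic spline, take the spline's minimizer as the initial guess, and compute the ground-truth global minimum by exhaustive dense search, as is done to label the training data. The test function `f` and all parameter choices here are illustrative assumptions, not taken from the paper; the learned iterative refinement network itself is omitted.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)

def f(x):
    # Illustrative multimodal test function with several local minima
    # (an assumption for this sketch, not a benchmark from the paper).
    return np.sin(3 * x) + 0.5 * np.cos(7 * x) + 0.1 * x**2

# Noisy black-box samples on the domain [-3, 3].
xs = np.linspace(-3.0, 3.0, 50)
ys = f(xs) + rng.normal(scale=0.05, size=xs.shape)

# "Ground truth" global minimum via exhaustive search on a dense grid,
# mirroring how training labels are obtained in the abstract.
grid = np.linspace(-3.0, 3.0, 100_000)
x_true = grid[np.argmin(f(grid))]

# Spline initialization: the minimizer of the fitted spline. A learned
# model would then iteratively refine this initial position estimate.
spline = CubicSpline(xs, ys)
x_init = grid[np.argmin(spline(grid))]

# Position error as a percentage of the domain width, matching the
# percent-error style of reporting used above.
err = abs(x_init - x_true) / (grid[-1] - grid[0]) * 100.0
print(f"x_true={x_true:.3f}  x_init={x_init:.3f}  error={err:.2f}%")
```

With dense sampling and low noise the spline minimizer often lands near the true global minimum, but on sparser or noisier data it can lock onto the wrong basin, which is the gap the iterative refinement model is trained to close.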