🤖 AI Summary
This work addresses black-box non-convex multimodal optimization—where multiple global and local optima must be identified simultaneously. Methodologically, it proposes a novel algorithm integrating variational inference, simulated annealing, and natural gradient descent: solution distributions are modeled via Gaussian mixture models; an annealing schedule dynamically balances exploration and exploitation; and natural gradient updates enhance robustness against ill-conditioned landscapes. Crucially, this is the first work to incorporate fitness shaping—a technique from evolutionary algorithms—into the variational inference framework, thereby improving solution diversity and interpretability. Empirically, the method significantly outperforms standard gradient descent and evolution strategies on benchmark multimodal problems. Furthermore, it demonstrates strong practical efficacy in planetary science inverse problems, successfully identifying multiple physically meaningful modes in real-world data.
📝 Abstract
We introduce a new multimodal optimization approach called Natural Variational Annealing (NVA) that combines the strengths of three foundational concepts to simultaneously search for multiple global and local modes of black-box nonconvex objectives. First, it implements a simultaneous search by using variational posteriors, such as mixtures of Gaussians. Second, it applies annealing to gradually trade off exploration for exploitation. Finally, it learns the variational search distribution using natural-gradient learning, where the updates resemble well-known, easy-to-implement algorithms. The three concepts come together in NVA, giving rise to new algorithms and also allowing us to incorporate "fitness shaping", a core concept from evolutionary algorithms. We assess the quality of search on simulations and compare it with methods based on gradient descent and evolution strategies. We also provide an application to a real-world inverse problem in planetary science.
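To make the three ingredients concrete, here is a minimal, illustrative sketch (not the authors' implementation) of an annealed search with a single Gaussian: samples are drawn from the search distribution, raw objective values are replaced by their ranks (a simple form of fitness shaping), an annealed inverse temperature `beta` gradually concentrates the sample weights on the best candidates, and the Gaussian's moments are updated with weighted averages, loosely mimicking a natural-gradient step. All names, the linear annealing schedule, and the specific weighting are assumptions for illustration; NVA itself uses mixtures of Gaussians to track multiple modes.

```python
import numpy as np

def annealed_gaussian_search(f, mu, sigma, n_samples=50, n_iters=200,
                             lr=0.1, beta0=0.1, seed=0):
    """Illustrative annealed variational search with one Gaussian.

    f     : objective to MINIMIZE
    mu    : initial mean (array), sigma: initial per-dimension std (array)
    beta  : inverse temperature, annealed upward so early iterations weight
            samples almost uniformly (exploration) and later iterations
            favor the best samples (exploitation).
    """
    rng = np.random.default_rng(seed)
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    for t in range(n_iters):
        beta = beta0 * (t + 1)                 # simple linear annealing schedule
        eps = rng.standard_normal((n_samples, mu.size))
        x = mu + sigma * eps                   # sample from N(mu, diag(sigma^2))
        fx = np.array([f(xi) for xi in x])
        # fitness shaping: replace raw objective values by their ranks,
        # making the update invariant to monotone transformations of f
        ranks = fx.argsort().argsort()         # rank 0 = best (smallest f)
        u = np.exp(-beta * ranks / n_samples)
        w = u / u.sum()                        # normalized sample weights
        # weighted moment updates (a natural-gradient-like step for a Gaussian)
        mu = (1 - lr) * mu + lr * (w @ x)
        sigma = np.sqrt((1 - lr) * sigma ** 2 + lr * (w @ (x - mu) ** 2))
    return mu, sigma
```

Run on a toy quadratic, the mean drifts toward the minimizer while the annealing schedule shrinks the search radius, e.g. `annealed_gaussian_search(lambda x: np.sum((x - 3.0) ** 2), [0.0], [2.0])` converges near `mu = [3.0]`.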