Local Entropy Search over Descent Sequences for Bayesian Optimization

📅 2025-11-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Global optimization is often infeasible, and in many complex design spaces unnecessary: only the locally optimal regions reachable via descent trajectories are of practical interest. Method: the paper proposes Local Entropy Search (LES), a Bayesian optimization framework that focuses sampling on the descent sequences an iterative optimizer can reach. Its core idea is to propagate the posterior belief over the objective through the gradient-descent dynamics, yielding a probability distribution over descent paths, and to guide acquisition by maximizing mutual information with that distribution, combining analytic entropy calculations with Monte Carlo sampling of descent sequences. Contribution/Results: LES enables targeted exploration of locally optimal descent paths. Experiments show that LES outperforms state-of-the-art local and global Bayesian optimization methods on high-dimensional synthetic benchmarks and standard test problems, with substantial gains in sample efficiency.

📝 Abstract
Searching large and complex design spaces for a global optimum can be infeasible and unnecessary. A practical alternative is to iteratively refine the neighborhood of an initial design using local optimization methods such as gradient descent. We propose local entropy search (LES), a Bayesian optimization paradigm that explicitly targets the solutions reachable by the descent sequences of iterative optimizers. The algorithm propagates the posterior belief over the objective through the optimizer, resulting in a probability distribution over descent sequences. It then selects the next evaluation by maximizing mutual information with that distribution, using a combination of analytic entropy calculations and Monte-Carlo sampling of descent sequences. Empirical results on high-complexity synthetic objectives and benchmark problems show that LES achieves strong sample efficiency compared to existing local and global Bayesian optimization methods.
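The descent-sequence distribution described in the abstract can be sketched numerically: fit a GP posterior to a few observations, draw posterior function samples, and push each sample through a simple gradient-descent loop; the endpoints then approximate the distribution over reachable local optima whose entropy LES targets. Everything below (the toy objective, kernel settings, and the grid-based descent) is an illustrative assumption, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D objective observed at a few points (illustrative, not the authors' setup).
def rbf(a, b, ls=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

X_obs = np.array([-1.5, -0.4, 0.3, 1.2])
y_obs = np.sin(3.0 * X_obs)

grid = np.linspace(-2.0, 2.0, 201)

# GP posterior mean and covariance on the grid.
K = rbf(X_obs, X_obs) + 1e-6 * np.eye(len(X_obs))
Ks = rbf(grid, X_obs)
mean = Ks @ np.linalg.solve(K, y_obs)
cov = rbf(grid, grid) - Ks @ np.linalg.solve(K, Ks.T) + 1e-6 * np.eye(len(grid))

# Draw posterior function samples and run finite-difference gradient descent on
# each; the endpoints are a Monte-Carlo picture of the distribution over
# solutions reachable by the optimizer's descent sequence.
samples = rng.multivariate_normal(mean, cov, size=100)

def descend(f_vals, i, steps=300):
    """Move downhill one grid cell at a time until a local minimum is reached."""
    h = grid[1] - grid[0]
    for _ in range(steps):
        g = (f_vals[min(i + 1, len(grid) - 1)] - f_vals[max(i - 1, 0)]) / (2 * h)
        j = int(np.clip(i - np.sign(g), 0, len(grid) - 1))
        if j == i:  # flat or at a local minimum
            break
        i = j
    return grid[i]

start = len(grid) // 2  # the optimizer's initial design
endpoints = np.array([descend(s, start) for s in samples])

# Entropy of the endpoint distribution -- the uncertainty LES drives down by
# choosing evaluations that are informative about the descent path.
p, _ = np.histogram(endpoints, bins=20, range=(-2.0, 2.0))
p = p / p.sum()
H = float(-np.sum(p[p > 0] * np.log(p[p > 0])))
print(f"endpoint entropy: {H:.3f} nats")
```

The sketch replaces the paper's analytic entropy terms with a pure histogram estimate; it only illustrates why posterior samples induce a distribution over descent outcomes.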
Problem

Research questions and friction points this paper is trying to address.

Optimizing large design spaces with local search methods
Targeting solutions reachable through iterative descent sequences
Improving sample efficiency in Bayesian optimization algorithms
Innovation

Methods, ideas, or system contributions that make the work stand out.

Local entropy search targets descent sequences
Propagates posterior belief through optimizer iterations
Maximizes mutual information using entropy calculations
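The mutual-information criterion in the last bullet can be illustrated with a plug-in Monte-Carlo estimate: given joint draws of (candidate observation, descent endpoint), the information gain is the marginal endpoint entropy minus the expected endpoint entropy after conditioning on a binned observation. The joint samples below are synthetic stand-ins, not draws from a real posterior.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical joint Monte-Carlo draws: for each posterior function sample we
# record which basin (local optimum) its descent sequence ends in and the
# objective value at a candidate evaluation point, here correlated by design.
n = 2000
endpoint = rng.integers(0, 3, size=n)        # index of the reached local optimum
y = endpoint + 0.5 * rng.standard_normal(n)  # candidate observation, noisy

def entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Marginal entropy of the endpoint distribution.
p_end = np.bincount(endpoint, minlength=3) / n
H_end = entropy(p_end)

# Expected conditional entropy after observing y, discretized into 10 bins.
edges = np.linspace(y.min(), y.max(), 11)
which = np.clip(np.digitize(y, edges) - 1, 0, 9)
H_cond = 0.0
for b in range(10):
    mask = which == b
    if mask.sum() == 0:
        continue
    pb = np.bincount(endpoint[mask], minlength=3) / mask.sum()
    H_cond += (mask.sum() / n) * entropy(pb)

mi = H_end - H_cond  # estimated information gain from evaluating the candidate
print(f"estimated information gain: {mi:.3f} nats")
```

An acquisition loop would repeat this estimate for each candidate point and evaluate the maximizer; the binning here stands in for the analytic entropy terms the paper combines with Monte-Carlo sampling.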
David Stenger
Institute for Data Science in Mechanical Engineering, RWTH Aachen University, Germany
Armin Lindicke
Institute for Data Science in Mechanical Engineering, RWTH Aachen University, Germany
Alexander von Rohr
TU Munich
Bayesian Optimization, Reinforcement Learning, Control Theory
Sebastian Trimpe
Professor, RWTH Aachen University
Control, Machine Learning, Networked Systems, Robotics