Task-free Adaptive Meta Black-box Optimization

📅 2026-01-29
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the inefficiency of handcrafted optimizers on complex black-box tasks and the limited generalization of existing meta-learning approaches, which rely heavily on a large set of predefined training tasks. To overcome these limitations, the authors propose the Adaptive meta Black-box Optimization Model (ABOM), a zero-shot meta-optimization paradigm that requires no pretraining tasks. ABOM integrates meta-learning and optimization into a closed-loop system, employing a parameterized evolutionary operator that adaptively refines its search strategy online using only data generated during the target optimization process. Experiments on synthetic benchmarks and real-world applications such as drone path planning show that ABOM matches or surpasses conventional optimizers. Visualization analyses further reveal that the model spontaneously evolves efficient search mechanisms reminiscent of natural selection and genetic recombination.

📝 Abstract
Handcrafted optimizers become prohibitively inefficient for complex black-box optimization (BBO) tasks. MetaBBO addresses this challenge by meta-learning to automatically configure optimizers for low-level BBO tasks, thereby eliminating heuristic dependencies. However, existing methods typically require extensive handcrafted training tasks to learn meta-strategies that generalize to target tasks, which poses a critical limitation for realistic applications with unknown task distributions. To overcome this issue, we propose the Adaptive meta Black-box Optimization Model (ABOM), which performs online parameter adaptation using solely optimization data from the target task, obviating the need for predefined task distributions. Unlike conventional MetaBBO frameworks that decouple meta-training and optimization phases, ABOM introduces a closed-loop adaptive parameter learning mechanism, in which parameterized evolutionary operators continuously self-update by leveraging the populations generated during optimization. This paradigm shift enables zero-shot optimization: ABOM achieves competitive performance on synthetic BBO benchmarks and realistic unmanned aerial vehicle path planning problems without any handcrafted training tasks. Visualization studies reveal that the parameterized evolutionary operators exhibit statistically significant search patterns, including natural selection and genetic recombination.
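The closed-loop idea in the abstract, where an evolutionary operator's parameters are updated online using only data generated while optimizing the target task, can be sketched as follows. This is a minimal illustrative analogue, not the paper's actual model: the "parameterized operator" is reduced to a single Gaussian mutation scale, and the update rule is the classic 1/5 success rule; `sphere`, `abom_like_optimize`, and all constants are assumptions for illustration.

```python
import random

def sphere(x):
    # Illustrative black-box objective: minimize the sum of squares.
    return sum(v * v for v in x)

def abom_like_optimize(f, dim=5, iters=200, seed=0):
    """Sketch of closed-loop online adaptation (NOT ABOM itself):
    the mutation scale sigma plays the role of the operator's
    learnable parameter and is updated using only the success
    statistics of solutions generated on the target task."""
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = f(x)
    sigma = 1.0        # the operator's parameter, adapted online
    successes = 0
    for t in range(1, iters + 1):
        # Parameterized variation operator: Gaussian mutation.
        child = [v + rng.gauss(0, sigma) for v in x]
        fc = f(child)
        if fc < fx:    # elitist selection on the target task
            x, fx = child, fc
            successes += 1
        if t % 20 == 0:
            # Closed-loop update from generated data (1/5 rule):
            # grow sigma if mutations succeed often, shrink otherwise.
            rate = successes / 20
            sigma *= 1.5 if rate > 0.2 else 0.6
            successes = 0
    return x, fx

best_x, best_f = abom_like_optimize(sphere)
```

The key property mirrored here is that no pretraining tasks are involved: every quantity driving the parameter update comes from the optimization run itself.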
Problem

Research questions and friction points this paper is trying to address.

Black-box Optimization
Meta-learning
Task Distribution
Adaptive Optimization
Zero-shot Optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive Meta Black-box Optimization
Zero-shot Optimization
Online Parameter Adaptation
Evolutionary Operators
Closed-loop Meta-learning