🤖 AI Summary
Designing effective hybrid metaheuristics for single-objective continuous optimization remains challenging due to heavy reliance on expert knowledge and the vast combinatorial space of algorithm components.
Method: This paper proposes METAFOR, a modular hybrid metaheuristic framework that supports configurable and extensible automatic composition of PSO, DE, CMA-ES, and local search, integrated with irace for end-to-end algorithm auto-generation. It introduces a hierarchical training set construction and leave-one-class-out cross-validation strategy to systematically analyze component interactions.
Contribution/Results: METAFOR reveals which hybridization patterns work best and how component contributions vary across problem classes. Evaluated on a diverse set of benchmark suites, the 17 automatically generated implementations show that hybrid designs outperform tuned single-approach implementations overall, while single approaches retain advantages on specific function classes. This work provides an empirical characterization of the effectiveness boundaries and applicability conditions of hybrid strategies, establishing foundational insights for principled hybrid metaheuristic design.
📝 Abstract
Hybrid metaheuristics are powerful techniques for solving difficult optimization problems that exploit the strengths of different approaches in a single implementation. For algorithm designers, however, creating hybrid metaheuristic implementations has become increasingly challenging due to the vast number of design options available in the literature and the fact that designers often rely on their own knowledge and intuition to come up with new algorithm designs. In this paper, we propose a modular metaheuristic software framework, called METAFOR, that can be coupled with an automatic algorithm configuration tool to automatically design hybrid metaheuristics. METAFOR is specifically designed to hybridize Particle Swarm Optimization, Differential Evolution, and Covariance Matrix Adaptation Evolution Strategy, and includes a local search module that allows their execution to be interleaved with a subordinate local search. We use the configuration tool irace to automatically generate 17 different metaheuristic implementations and evaluate their performance on a diverse set of continuous optimization problems. Our results show that, across all the considered problem classes, automatically generated hybrid implementations are able to outperform configured single-approach implementations, while the latter offer advantages on specific classes of functions. We provide useful insights on the type of hybridization that works best for specific problem classes, the algorithm components that contribute to the performance of the algorithms, and the advantages and disadvantages of two well-known instance separation strategies: creating a stratified training set using a fixed percentage, and leave-one-class-out cross-validation.
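The leave-one-class-out strategy mentioned above can be sketched as follows. This is a minimal illustration of the general idea (hold out all functions of one problem class for testing, tune on the remaining classes), not code from the METAFOR framework; the class and function names are hypothetical.

```python
def leave_one_class_out(classes):
    """Yield (held_out_class, train, test) splits over benchmark functions.

    `classes` maps a problem-class name (e.g. "multimodal") to its list of
    benchmark functions. Each split holds out one entire class for testing
    and trains (i.e., configures the algorithm) on all remaining classes.
    """
    for held_out in classes:
        test = list(classes[held_out])
        train = [f for name, funcs in classes.items()
                 if name != held_out
                 for f in funcs]
        yield held_out, train, test


if __name__ == "__main__":
    # Illustrative benchmark grouping; class/function names are made up.
    benchmark = {
        "unimodal": ["sphere", "ellipsoid"],
        "multimodal": ["rastrigin", "ackley"],
        "composition": ["comp_f1", "comp_f2"],
    }
    for held_out, train, test in leave_one_class_out(benchmark):
        print(f"held out: {held_out:12s} train={train} test={test}")
```

Each generated split would correspond to one configuration run (e.g. with irace) on the training functions, followed by evaluation on the held-out class, which exposes how well a generated design generalizes to an unseen problem class.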