LLaMEA: A Large Language Model Evolutionary Algorithm for Automatically Generating Metaheuristics

📅 2024-05-30
🏛️ IEEE Transactions on Evolutionary Computation
📈 Citations: 2
Influential: 1
🤖 AI Summary
To reduce the heavy reliance of metaheuristic design on expert knowledge, this paper proposes LLaMEA, a framework that embeds large language models (specifically GPT-4) in an evolutionary loop for the end-to-end automated generation and refinement of black-box optimization algorithms. Using prompt engineering, LLaMEA iteratively generates algorithm descriptions, produces executable code, evaluates that code via runtime performance feedback, and performs selection and mutation, all without human intervention. The automatically generated algorithms outperform CMA-ES and Differential Evolution on the 5D BBOB benchmark and also generalize across dimensions: algorithms evolved solely on 5D instances remain competitive on 10D and 20D tasks. The results demonstrate the feasibility of leveraging foundation models for automated algorithm design and point toward a new approach to intelligent optimization.
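The generate–evaluate–select loop described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the `llm_generate` function stands in for a real GPT-4 call (stubbed here with random mutation variants so the sketch runs offline), and the sphere benchmark, budget, and elitist (1+1)-style selection are simplifying assumptions.

```python
import random

def llm_generate(prompt):
    """Placeholder for an LLM call (e.g. GPT-4) that returns optimizer source code.
    Stubbed with a random-step-size hill climber so the sketch runs without an API."""
    step = random.uniform(0.1, 2.0)
    return (
        "def optimize(f, dim, budget):\n"
        "    import random\n"
        "    best_x = [random.uniform(-5, 5) for _ in range(dim)]\n"
        "    best_y = f(best_x)\n"
        "    for _ in range(budget - 1):\n"
        f"        x = [xi + random.gauss(0, {step}) for xi in best_x]\n"
        "        y = f(x)\n"
        "        if y < best_y:\n"
        "            best_x, best_y = x, y\n"
        "    return best_y\n"
    )

def evaluate(code, f=lambda x: sum(xi * xi for xi in x), dim=5, budget=200):
    """Execute the generated code and score it on a benchmark (here: sphere)."""
    namespace = {}
    try:
        exec(code, namespace)
        return namespace["optimize"](f, dim, budget)
    except Exception:
        return float("inf")  # broken code receives the worst possible fitness

def llamea(generations=10):
    """Elitist loop: keep the best algorithm, ask the LLM to mutate it."""
    best_code = llm_generate("Write a black-box optimizer.")
    best_score = evaluate(best_code)
    for _ in range(generations):
        child = llm_generate(f"Improve this optimizer:\n{best_code}")
        score = evaluate(child)
        if score <= best_score:  # selection driven by runtime feedback
            best_code, best_score = child, score
    return best_code, best_score
```

The key idea the sketch preserves is that fitness comes from actually running the generated code, so syntactically valid but broken candidates are automatically selected against.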

📝 Abstract
Large Language Models (LLMs) such as GPT-4 have demonstrated their ability to understand natural language and generate complex code snippets. This paper introduces a novel Large Language Model Evolutionary Algorithm (LLaMEA) framework, leveraging GPT models for the automated generation and refinement of algorithms. Given a set of criteria and a task definition (the search space), LLaMEA iteratively generates, mutates and selects algorithms based on performance metrics and feedback from runtime evaluations. This framework offers a unique approach to generating optimized algorithms without requiring extensive prior expertise. We show how this framework can be used to generate novel black-box metaheuristic optimization algorithms automatically. LLaMEA generates multiple algorithms that outperform state-of-the-art optimization algorithms (Covariance Matrix Adaptation Evolution Strategy and Differential Evolution) on the five-dimensional black-box optimization benchmark (BBOB). The algorithms also show competitive performance on the 10- and 20-dimensional instances of the test functions, although they have not seen such instances during the automated generation process. The results demonstrate the feasibility of the framework and identify future directions for automated generation and optimization of algorithms via LLMs.
Problem

Research questions and friction points this paper is trying to address.

Large Language Models
Algorithm Optimization
Complex Problem Solving

Innovation

Methods, ideas, or system contributions that make the work stand out.

LLaMEA
Algorithm Evolution
Large Language Models