From Understanding to Excelling: Template-Free Algorithm Design through Structural-Functional Co-Evolution

📅 2025-03-13
🤖 AI Summary
Existing automated algorithm generation methods (e.g., EoH, FunSearch) rely on predefined templates and local function optimization, limiting their capacity for architecture-level co-evolution. Method: We propose a structure–function dual-dimensional co-evolution paradigm: (i) leveraging large language models for end-to-end semantic understanding and code generation from natural language specifications; (ii) introducing a coupled structure–function evaluation metric enabling multi-level module joint optimization and emergent algorithmic innovation; and (iii) employing closed-loop feedback to guide global search. Contribution/Results: Our approach eliminates dependence on handcrafted templates, significantly enhancing algorithmic design autonomy and architectural breakthrough capability. On multiple classical benchmarks, it outperforms baselines—including FunSearch and EoH—in both performance and novelty, successfully generating novel, high-efficiency algorithms that surpass human-designed counterparts. Moreover, it demonstrates strong adaptability to unseen environments.

📝 Abstract
Large language models (LLMs) have greatly accelerated the automation of algorithm generation and optimization. However, current methods such as EoH and FunSearch mainly rely on predefined templates and expert-specified functions that focus solely on the local evolution of key functionalities. Consequently, they fail to fully leverage the synergistic benefits of the overall architecture and the potential of global optimization. In this paper, we introduce an end-to-end algorithm generation and optimization framework based on LLMs. Our approach utilizes the deep semantic understanding of LLMs to convert natural language requirements or human-authored papers into code solutions, and employs a two-dimensional co-evolution strategy to optimize both functional and structural aspects. This closed-loop process spans problem analysis, code generation, and global optimization, automatically identifying key algorithm modules for multi-level joint optimization and continually enhancing performance and design innovation. Extensive experiments demonstrate that our method outperforms traditional local optimization approaches in both performance and innovation, while also exhibiting strong adaptability to unknown environments and breakthrough potential in structural design. By building on human research, our framework generates and optimizes novel algorithms that surpass those designed by human experts, broadening the applicability of LLMs for algorithm design and providing a novel solution pathway for automated algorithm development.
Problem

Research questions and friction points this paper is trying to address.

Automating algorithm generation without predefined templates
Optimizing both functional and structural aspects globally
Enhancing performance and innovation in algorithm design
Innovation

Methods, ideas, or system contributions that make the work stand out.

LLMs convert natural language to code
Two-dimensional co-evolution optimizes structure and function
Closed-loop process enhances algorithm performance and innovation
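The closed-loop, two-dimensional co-evolution described above can be sketched as a selection loop that alternates coarse structure-level edits with fine function-level refinements. This is a minimal illustrative sketch, not the paper's implementation: the names `llm_mutate`, `evaluate`, and `co_evolve` are hypothetical, the LLM call is mocked as a random perturbation of a numeric genome standing in for code, and the coupled structure–function score is a toy objective standing in for benchmark runs.

```python
import random

random.seed(0)  # deterministic toy run

def llm_mutate(candidate, level):
    """Placeholder for an LLM edit proposal (hypothetical): rewrites a
    candidate at either the 'structure' or the 'function' level.
    Structural edits are modeled as coarser perturbations."""
    step = 1.0 if level == "structure" else 0.1
    return [g + random.uniform(-step, step) for g in candidate]

def evaluate(candidate):
    """Toy coupled structure-function score (lower is better),
    standing in for running the generated algorithm on benchmarks."""
    return sum(g * g for g in candidate)

def co_evolve(pop_size=8, genome_len=4, generations=30):
    pop = [[random.uniform(-2, 2) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for gen in range(generations):
        # Two-dimensional co-evolution: alternate structure-level edits
        # (even generations) with function-level refinements (odd ones).
        level = "structure" if gen % 2 == 0 else "function"
        children = [llm_mutate(p, level) for p in pop]
        # Closed-loop feedback: parent and child scores jointly drive
        # elitist selection, which would steer the next LLM prompts.
        pop = sorted(pop + children, key=evaluate)[:pop_size]
    return pop[0], evaluate(pop[0])

best, score = co_evolve()
print(f"best score after co-evolution: {score:.4f}")
```

Elitist selection makes the best score monotonically non-increasing across generations, mirroring the paper's claim that global feedback continually enhances performance rather than only locally optimizing one function.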
Zhe Zhao
University of Science and Technology of China, Hefei 230026, China
Haibin Wen
The Hong Kong University of Science and Technology (Guangzhou), Guangzhou 511458, China
Pengkun Wang
University of Science and Technology of China, Hefei 230026, China
Ye Wei
City University of Hong Kong; Max Planck Institute
Learning of complex systems · Data-driven optimization
Zaixi Zhang
Princeton University
AI for Science · Generative AI · AI Security · BioSecurity
Xi Lin
City University of Hong Kong, Hong Kong 999077, China
Fei Liu
City University of Hong Kong, Hong Kong 999077, China
Bo An
Nanyang Technological University, Singapore 639798, Singapore
Hui Xiong
Senior Scientist, Candela Corporation
Ultrafast dynamics · atomic molecular physics · free electron laser
Yang Wang
University of Science and Technology of China, Hefei 230026, China
Qingfu Zhang
Chair Professor, FIEEE, City University of Hong Kong
evolutionary computation · multiobjective optimization · computational intelligence