🤖 AI Summary
To address the prohibitively high LLM invocation cost arising from single-solution evolution when integrating large language models (LLMs) with evolutionary algorithms (EAs) for complex optimization, this paper proposes a novel "solution-space evolution" paradigm. Instead of evolving individual solutions, it employs LLMs to generate parameterized programs that define structured solution spaces; evolution then proceeds via a score-guided search over these spaces. This shifts the evolutionary unit from single points to parameterized subsets of the search space, drastically reducing the number of LLM calls and easing scalability bottlenecks in high-dimensional optimization. The approach achieves strong results on three classical problems: (i) improving the asymptotic lower bound on the cap set constant to $C \geq 2.2203$; (ii) tightening the lower bound on the Shannon capacity of a cycle graph; and (iii) discovering online bin-packing heuristics that outperform standard hand-crafted strategies on established benchmarks.
📝 Abstract
While combining large language models (LLMs) with evolutionary algorithms (EAs) shows promise for solving complex optimization problems, current approaches typically evolve individual solutions, often incurring high LLM call costs. We introduce $X$-evolve, a paradigm-shifting method that instead evolves solution spaces $X$ (sets of individual solutions), subsets of the overall search space $S$. In $X$-evolve, LLMs generate tunable programs wherein certain code snippets, designated as parameters, define a tunable solution space. A score-based search algorithm then efficiently explores this parametrically defined space, guided by feedback from objective function scores. This strategy enables broader and more efficient exploration, which can potentially accelerate convergence at a much lower search cost, requiring up to two orders of magnitude fewer LLM calls than prior leading methods. We demonstrate $X$-evolve's efficacy across three distinct hard optimization problems. For the cap set problem, we discover a larger partial admissible set, establishing a new, tighter asymptotic lower bound for the cap set constant ($C \ge 2.2203$). In information theory, we uncover a larger independent set for the fifth strong power of the 15-vertex cycle graph ($\mathcal{C}_{15}^{\boxtimes 5}$, size 19,946), thereby raising the known lower bound on its Shannon capacity. Furthermore, for the NP-hard online bin packing problem, we generate heuristics that consistently outperform standard strategies across established benchmarks. By evolving solution spaces, our method considerably improves search effectiveness, making it possible to tackle high-dimensional problems that were previously computationally prohibitive.
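To make the core idea concrete, here is a minimal, hypothetical sketch (not the paper's actual implementation) of what a "tunable program" for online bin packing might look like. An LLM would emit the heuristic template; the constants `alpha` and `beta` are the designated parameters, so a single LLM call defines a whole solution space that a cheap, score-guided search can then explore. All function names, parameter ranges, and the random-search loop are illustrative assumptions.

```python
import random

def make_priority(alpha, beta):
    """Return a heuristic that scores placing an item into a bin.

    In the sketched paradigm, this template is LLM-generated once;
    alpha and beta are the tunable parameters defining the space."""
    def priority(item, capacity):
        residual = capacity - item
        # Prefer tight fits; alpha/beta shape the penalty on leftover space.
        return -alpha * residual - beta * residual ** 2
    return priority

def pack(items, priority, bin_size=100):
    """Online packing: each item goes into the feasible bin with max priority.

    Returns the number of bins used (the objective to minimize)."""
    bins = []  # remaining capacities of open bins
    for item in items:
        feasible = [i for i, cap in enumerate(bins) if cap >= item]
        if feasible:
            best = max(feasible, key=lambda i: priority(item, bins[i]))
            bins[best] -= item
        else:
            bins.append(bin_size - item)  # open a new bin
    return len(bins)

def search(items, n_trials=200, seed=0):
    """Score-guided random search over the parameter space (no LLM calls)."""
    rng = random.Random(seed)
    best_params, best_score = None, float("inf")
    for _ in range(n_trials):
        alpha, beta = rng.uniform(0.0, 5.0), rng.uniform(0.0, 1.0)
        score = pack(items, make_priority(alpha, beta))
        if score < best_score:
            best_params, best_score = (alpha, beta), score
    return best_params, best_score

# Toy instance: 200 items of size 10..60 into bins of capacity 100.
item_rng = random.Random(1)
items = [item_rng.randint(10, 60) for _ in range(200)]
params, bins_used = search(items)
```

The design point this illustrates is the cost asymmetry: the expensive LLM is invoked once per *space* (the template), while the inexpensive inner loop evaluates many candidate *points* (parameter settings) against the objective score.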