Pareto-Grid-Guided Large Language Models for Fast and High-Quality Heuristics Design in Multi-Objective Combinatorial Optimization

📅 2025-07-28
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Existing evolutionary algorithms for multi-objective combinatorial optimization problems (MOCOPs) rely heavily on manual parameter tuning and generalize poorly, while current LLM-based approaches neglect Pareto balance and computational efficiency. To address these limitations, the paper proposes MPaGE, a large language model-guided heuristic auto-design method built on Pareto grid structuring. The approach discretizes the Pareto front into a structured grid to steer the LLM toward generating high-quality, semantically diverse heuristic rules, which are then integrated into a lightweight multi-objective evolutionary framework (SEMO) for efficient mutation and selection. Experiments across multiple standard MOCOP benchmarks show that MPaGE matches the solution-set quality (measured by IGD and HV) of classical evolutionary algorithms while reducing average runtime by 57%. It significantly outperforms existing LLM-driven methods and is presented as the first approach to achieve Pareto-directed, low-overhead, and highly generalizable automatic generation of multi-objective heuristics.

πŸ“ Abstract
Multi-objective combinatorial optimization problems (MOCOP) frequently arise in practical applications that require the simultaneous optimization of conflicting objectives. Although traditional evolutionary algorithms can be effective, they typically depend on domain knowledge and repeated parameter tuning, limiting flexibility when applied to unseen MOCOP instances. Recently, integration of Large Language Models (LLMs) into evolutionary computation has opened new avenues for automatic heuristic generation, using their advanced language understanding and code synthesis capabilities. Nevertheless, most existing approaches predominantly focus on single-objective tasks, often neglecting key considerations such as runtime efficiency and heuristic diversity in multi-objective settings. To bridge this gap, we introduce Multi-heuristics for MOCOP via Pareto-Grid-guided Evolution of LLMs (MPaGE), a novel enhancement of the Simple Evolutionary Multiobjective Optimization (SEMO) framework that leverages LLMs and Pareto Front Grid (PFG) technique. By partitioning the objective space into grids and retaining top-performing candidates to guide heuristic generation, MPaGE utilizes LLMs to prioritize heuristics with semantically distinct logical structures during variation, thus promoting diversity and mitigating redundancy within the population. Through extensive evaluations, MPaGE demonstrates superior performance over existing LLM-based frameworks, and achieves competitive results to traditional Multi-objective evolutionary algorithms (MOEAs), with significantly faster runtime. Our code is available at: https://github.com/langkhachhoha/MPaGE.
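The Pareto Front Grid (PFG) step described in the abstract, partitioning the objective space into grids and retaining top-performing candidates per cell, can be sketched as follows. This is a minimal illustration under assumed bi-objective minimization; the function names (`assign_grid`, `pfg_leaders`) and the per-cell selection rule (smallest objective sum) are this sketch's assumptions, not the paper's implementation.

```python
from collections import defaultdict

def assign_grid(points, divisions):
    """Map each bi-objective point to a grid cell by normalizing the
    objective space and splitting each axis into `divisions` bins."""
    lo = [min(p[d] for p in points) for d in range(2)]
    hi = [max(p[d] for p in points) for d in range(2)]
    cells = defaultdict(list)
    for p in points:
        cell = tuple(
            min(divisions - 1,  # clamp the boundary point into the last bin
                int(divisions * (p[d] - lo[d]) / (hi[d] - lo[d] or 1.0)))
            for d in range(2)
        )
        cells[cell].append(p)
    return cells

def pfg_leaders(points, divisions=4):
    """Retain one top candidate per occupied cell; here the candidate with
    the smallest objective sum, assuming minimization."""
    cells = assign_grid(points, divisions)
    return [min(group, key=sum) for group in cells.values()]
```

For example, `pfg_leaders([(0.0, 1.0), (1.0, 0.0), (0.05, 0.95), (0.5, 0.5)], divisions=2)` keeps a single representative for the two near-identical points in the top-left cell, which is the kind of redundancy pruning the grid is meant to provide before candidates guide the LLM.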
Problem

Research questions and friction points this paper is trying to address.

Solving multi-objective combinatorial optimization with conflicting objectives
Enhancing heuristic diversity and runtime efficiency in LLM-based approaches
Bridging gap between traditional MOEAs and LLM-driven heuristic generation
Innovation

Methods, ideas, or system contributions that make the work stand out.

LLMs integrated with Pareto-Grid for heuristic design
Grid-based objective space partitioning for top candidates
Promotes heuristic diversity via semantic logic variation
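For context on the SEMO framework these heuristics plug into, a bare-bones SEMO loop looks roughly like this. It is a sketch assuming minimization; the `evaluate` and `mutate` arguments stand in for problem-specific (in the paper, LLM-generated) heuristics and are purely illustrative.

```python
import random

def dominates(a, b):
    """a Pareto-dominates b (minimization): no worse in every objective,
    strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def semo(evaluate, init, mutate, iterations=1000, seed=0):
    """Simple Evolutionary Multiobjective Optimization: keep an archive of
    mutually non-dominated solutions and mutate a random member each step."""
    rng = random.Random(seed)
    archive = {init: evaluate(init)}  # solution -> objective vector
    for _ in range(iterations):
        parent = rng.choice(list(archive))
        child = mutate(parent, rng)
        f = evaluate(child)
        if any(dominates(g, f) for g in archive.values()):
            continue  # child is dominated by the archive: discard it
        # drop archive members the child dominates, then insert the child
        archive = {s: g for s, g in archive.items() if not dominates(f, g)}
        archive[child] = f
    return archive
```

As a toy usage, `semo(lambda s: (s, 10 - s), 5, lambda s, rng: max(0, min(10, s + rng.choice([-1, 1]))))` walks along the trade-off line between the two objectives and accumulates mutually non-dominated integer states in its archive.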
Authors
Minh Hieu Ha (Hanoi University of Science and Technology, Vietnam)
Hung Phan (PhD Student in Computer Science, Iowa State University)
Tung Duy Doan (Hanoi University of Science and Technology, Vietnam)
Tung Dao (Hanoi University of Science and Technology, Vietnam)
Dao Tran (FPT Software AI Center, Vietnam)
Huynh Thi Thanh Binh (Hanoi University of Science and Technology)