TIDE: Tuning-Integrated Dynamic Evolution for LLM-Based Automated Heuristic Design

📅 2026-01-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a critical limitation in existing large language model (LLM)-driven heuristic design methods, which treat algorithm evolution as monolithic text generation and thereby neglect the coupling between discrete structures and continuous parameters. This oversight often leads to the premature discarding of high-quality solutions due to uncalibrated parameters and early convergence caused by simplistic similarity measures. To overcome these issues, we propose TIDE, a novel framework that achieves the first co-evolutionary decoupling of structural evolution and parameter optimization. TIDE employs a parallel island model guided by tree edit distance to preserve structural diversity at the outer level, while the inner level integrates LLM-based logical generation with differential mutation operators for precise parameter tuning. A UCB-based scheduler dynamically allocates prompting resources across strategies. Evaluated on nine combinatorial optimization problems, TIDE significantly outperforms state-of-the-art methods in solution quality, search efficiency, and computational cost.
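The summary mentions a UCB-based scheduler that allocates prompting resources across strategies. As a rough illustration of how such a scheduler works, here is a minimal UCB1 sketch; the paper's exact reward definition, exploration constant, and strategy set are not given here, so all of those are assumptions.

```python
import math

class UCBScheduler:
    """Minimal UCB1 bandit over prompt strategies (illustrative sketch only;
    the reward signal and exploration weight c are assumed, not from the paper)."""

    def __init__(self, strategies, c=1.4):
        self.strategies = list(strategies)
        self.c = c                                   # exploration weight (assumption)
        self.counts = {s: 0 for s in self.strategies}
        self.rewards = {s: 0.0 for s in self.strategies}

    def select(self):
        total = sum(self.counts.values())
        # Try each strategy once before applying the UCB formula.
        for s in self.strategies:
            if self.counts[s] == 0:
                return s
        def ucb(s):
            mean = self.rewards[s] / self.counts[s]
            return mean + self.c * math.sqrt(math.log(total) / self.counts[s])
        return max(self.strategies, key=ucb)

    def update(self, strategy, reward):
        # Reward could be, e.g., fitness improvement of the generated heuristic.
        self.counts[strategy] += 1
        self.rewards[strategy] += reward
```

Strategies whose prompts keep producing fitness gains accumulate higher mean reward and get selected more often, while the square-root term keeps occasionally revisiting neglected strategies.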

📝 Abstract
Although Large Language Models have advanced Automated Heuristic Design, treating algorithm evolution as a monolithic text generation task overlooks the coupling between discrete algorithmic structures and continuous numerical parameters. Consequently, existing methods often discard promising algorithms due to uncalibrated constants and suffer from premature convergence resulting from simple similarity metrics. To address these limitations, we propose TIDE, a Tuning-Integrated Dynamic Evolution framework designed to decouple structural reasoning from parameter optimization. TIDE features a nested architecture where an outer parallel island model utilizes Tree Similarity Edit Distance to drive structural diversity, while an inner loop integrates LLM-based logic generation with a differential mutation operator for parameter tuning. Additionally, a UCB-based scheduler dynamically prioritizes high-yield prompt strategies to optimize resource allocation. Extensive experiments across nine combinatorial optimization problems demonstrate that TIDE discovers heuristics that significantly outperform state-of-the-art baselines in solution quality while achieving improved search efficiency and reduced computational costs.
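The abstract's inner loop tunes the numeric constants of a heuristic with a differential mutation operator. A minimal sketch of classical DE/rand/1 mutation over such parameter vectors follows; TIDE's precise variant, scale factor F, and vector encoding are assumptions for illustration.

```python
import random

def differential_mutation(population, F=0.5, rng=random):
    """DE/rand/1 mutation (sketch). Each element of `population` is a list of
    the numeric constants extracted from one heuristic; the mutant perturbs a
    base vector by the scaled difference of two others. F and the sampling
    scheme are assumptions, not taken from the paper."""
    r1, r2, r3 = rng.sample(range(len(population)), 3)
    a, b, c = population[r1], population[r2], population[r3]
    return [ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)]
```

Keeping mutation in this continuous parameter space is what lets the structure of a heuristic survive even when its initial constants are poorly calibrated.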
Problem

Research questions and friction points this paper is trying to address.

Automated Heuristic Design
Large Language Models
Algorithm Evolution
Parameter Optimization
Premature Convergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

TIDE
Algorithm Evolution
Parameter Tuning
Structural Diversity
LLM-Based Heuristic Design
👥 Authors
Chentong Chen
School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, China
Mengyuan Zhong
School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, China
Ye Fan
Computer Science, University of British Columbia (Computer Graphics, Numerical Simulation)
Jialong Shi
School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, China
Jianyong Sun
School of Mathematics and Statistics, Xi'an Jiaotong University, China (evolutionary computation, statistical machine learning)