🤖 AI Summary
This work addresses automatic program optimization with large language models (LLMs) in a black-box setting. We propose Retrieval-Augmented Search (RAS), a framework that uses LLM-generated natural-language descriptions to drive context-aware retrieval of slow-fast program pairs, guiding beam search toward efficient exploration of optimization trajectories. We further introduce AEGIS (Atomic-level Edit Generation and Inference Strategy), which decomposes training samples into fine-grained, interpretable atomic edit operations. Our core contributions are twofold: (1) the first natural-language-semantics-driven program retrieval mechanism, and (2) the first explicit modeling of program edits as atomic operations. Experiments show that RAS achieves a 1.8× improvement over prior state-of-the-art black-box methods, and that AEGIS attains a 1.37× gain while making significantly smaller edits, improving the controllability and interpretability of the optimization process.
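To make the atomic-edit idea concrete, the sketch below decomposes a slow-fast program pair into line-level edit operations using Python's standard `difflib`. This is only an illustration of the decomposition concept: the paper's actual edit granularity and representation may differ, and all names here are invented for the example.

```python
import difflib

def atomic_edits(slow, fast):
    # Decompose a slow -> fast program pair into fine-grained, line-level
    # edit operations, in the spirit of the atomic edits described above.
    # Each operation is (kind, removed_lines, added_lines).
    slow_lines, fast_lines = slow.splitlines(), fast.splitlines()
    ops = []
    sm = difflib.SequenceMatcher(a=slow_lines, b=fast_lines)
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "replace":
            ops.append(("replace", slow_lines[i1:i2], fast_lines[j1:j2]))
        elif tag == "delete":
            ops.append(("delete", slow_lines[i1:i2], []))
        elif tag == "insert":
            ops.append(("insert", [], fast_lines[j1:j2]))
    return ops
```

A training pair then yields a sequence of small, interpretable operations instead of one monolithic rewrite, which is what makes the resulting edits more incremental and auditable.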
📝 Abstract
With the advent of large language models (LLMs), there has been a great deal of interest in applying them to solve difficult programming tasks. Recent work has demonstrated their potential at program optimization, a key challenge in programming languages research. We propose a black-box adaptation method called Retrieval-Augmented Search (RAS) that performs beam search over candidate optimizations; at each step, it retrieves in-context examples from a given training dataset of slow-fast program pairs to guide the LLM. Critically, we find that performing contextual retrieval based on an LLM-generated natural language description significantly outperforms retrieval based on the source code. In addition, we propose a method called AEGIS for improving interpretability by decomposing training examples into "atomic edits" that are significantly more incremental in nature. We show that RAS performs 1.8× better than prior state-of-the-art black-box adaptation strategies, and that AEGIS performs 1.37× better while performing significantly smaller edits.
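The RAS loop described above can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: `describe` stands in for an LLM call that produces a natural-language description, `similarity` uses token overlap where a real system would embed the descriptions, and `propose`/`score` stand in for the LLM edit generator and the performance measurement.

```python
def describe(program):
    # Stand-in for an LLM-generated natural-language description
    # of what the program does (here: its sorted unique tokens).
    return " ".join(sorted(set(program.split())))

def similarity(a, b):
    # Toy similarity between two descriptions: Jaccard token overlap.
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / max(len(sa | sb), 1)

def retrieve(description, train_pairs, k=2):
    # Return the k slow-fast training pairs whose slow program's
    # description best matches the query description.
    ranked = sorted(train_pairs,
                    key=lambda p: similarity(description, describe(p[0])),
                    reverse=True)
    return ranked[:k]

def ras_step(beam, train_pairs, propose, score, width=2):
    # One beam-search step: for each candidate program, retrieve
    # in-context examples, generate new candidates with the LLM
    # (abstracted as `propose`), and keep the `width` best by `score`.
    candidates = []
    for prog in beam:
        examples = retrieve(describe(prog), train_pairs)
        candidates.extend(propose(prog, examples))
    return sorted(candidates, key=score, reverse=True)[:width]
```

The key point the sketch captures is that retrieval is keyed on the description, not the source text, so programs with different surface syntax but similar behavior can still retrieve the same optimization examples.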