Guided Tensor Lifting

📅 2025-04-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenges of automatically migrating legacy code to high-performance machine learning domain-specific languages (DSLs)—a task traditionally hindered by heavy reliance on manual heuristics and poor scalability. We propose an LLM-driven, probabilistic grammar-guided enumerative synthesis method. Our approach uniquely leverages large language models to automatically learn and model domain-specific migration rules, encoding them as probabilistic context-free grammars (PCFGs) to eliminate manual rule design and hard-coding. Integrated with efficient enumerative synthesis and DSL compiler optimizations, our method achieves significant improvements over existing state-of-the-art tools across multiple benchmarks: it delivers superior correctness rates, markedly accelerates inference, and demonstrates strong generalization—without any human intervention throughout the process.

📝 Abstract
Domain-specific languages (DSLs) for machine learning are revolutionizing the speed and efficiency of machine learning workloads, as they give users easy access to high-performance compiler optimizations and accelerators. However, to take advantage of these capabilities, users must first translate their legacy code from the language it is currently written in into the new DSL. Several recent works have addressed this problem of automatically lifting code into these DSLs, proposing program synthesis as a solution. However, synthesis is expensive and struggles to scale without carefully designed, hard-wired heuristics. In this paper, we present an approach to lifting that combines enumerative synthesis with a Large Language Model used to automatically learn the domain-specific heuristics for program lifting, in the form of a probabilistic grammar. Our approach outperforms the state-of-the-art tools in this area, despite using only learned heuristics.
Problem

Research questions and friction points this paper is trying to address.

Automatically translating legacy code into domain-specific languages
Overcoming scalability issues in program synthesis for code lifting
Learning domain-specific heuristics using Large Language Models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines enumerative synthesis with Large Language Model
Learns domain-specific heuristics via probabilistic grammar
Outperforms state-of-the-art tools using learned heuristics
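The core idea above, enumerating candidate programs in the order suggested by a probabilistic grammar, can be sketched in a few lines. The grammar, its productions, and its probabilities below are toy placeholders (in the paper's setting the productions would cover tensor operations and the weights would be learned by an LLM, not fixed by hand), so this is a minimal illustration of the technique rather than the paper's implementation:

```python
import heapq
import itertools

# Hypothetical toy probabilistic grammar over arithmetic expressions.
# Each entry is (production, probability); "E" marks an unexpanded hole.
GRAMMAR = [
    (("add", "E", "E"), 0.2),
    (("mul", "E", "E"), 0.3),
    (("var", "x"), 0.3),
    (("var", "y"), 0.2),
]

def has_hole(tree):
    """True if the partial program still contains an unexpanded 'E'."""
    if tree == "E":
        return True
    return isinstance(tree, tuple) and any(has_hole(t) for t in tree)

def fill(tree, prod):
    """Replace the leftmost 'E' hole with prod. Returns (new_tree, filled)."""
    if tree == "E":
        return prod, True
    if isinstance(tree, tuple):
        out, filled = [], False
        for t in tree:
            if not filled:
                t, filled = fill(t, prod)
            out.append(t)
        return tuple(out), filled
    return tree, False

def enumerate_programs(budget):
    """Yield complete programs in order of decreasing derivation probability,
    using a min-heap keyed on the negated probability."""
    counter = itertools.count()  # tie-breaker so the heap never compares trees
    heap = [(-1.0, next(counter), "E")]
    while heap and budget > 0:
        neg_p, _, tree = heapq.heappop(heap)
        if not has_hole(tree):
            budget -= 1
            yield -neg_p, tree
            continue
        for prod, p in GRAMMAR:
            new_tree, _ = fill(tree, prod)
            heapq.heappush(heap, (neg_p * p, next(counter), new_tree))

def evaluate(tree, env):
    """Interpret a complete program against a variable environment."""
    op = tree[0]
    if op == "var":
        return env[tree[1]]
    a, b = evaluate(tree[1], env), evaluate(tree[2], env)
    return a + b if op == "add" else a * b

def synthesize(examples, budget=200):
    """Return the most probable program consistent with all I/O examples."""
    for _prob, tree in enumerate_programs(budget):
        if all(evaluate(tree, env) == out for env, out in examples):
            return tree
    return None
```

For instance, `synthesize([({"x": 2, "y": 3}, 6), ({"x": 4, "y": 5}, 20)])` finds a multiplication of the two variables within a handful of candidates, because the grammar's weights steer the enumerator toward multiplication first; with uniform weights the same search would explore far more candidates, which is the scalability point the paper makes.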
Yixuan Li
University of Edinburgh, United Kingdom
José Wesley de Souza Magalhães
University of Edinburgh, United Kingdom
Alexander Brauckmann
University of Edinburgh
Michael F. P. O'Boyle
University of Edinburgh, United Kingdom
Elizabeth Polgreen
Lecturer, University of Edinburgh