Q-Bridge: Code Translation for Quantum Machine Learning via LLMs

📅 2026-03-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of translating classical machine learning (CML) code into quantum machine learning (QML) implementations, a task hindered by the scarcity of high-quality datasets and robust translation frameworks. The authors propose the first reproducible LLM-driven code translation framework, introducing the CML-2-QML dataset and integrating supervised LoRA fine-tuning, self-iterative data augmentation, and a novel strategy that fuses verifiable and unverifiable code pairs. This approach enables the generation of QML code that is structurally aligned with its CML counterpart, deterministically correct, and consistent across diverse quantum architectures. Beyond demonstrating the feasibility of direct CML-to-QML translation, the method facilitates architectural innovation while significantly enhancing the fidelity and interpretability of the generated quantum code.
📝 Abstract
Large language models have recently shown potential in bridging the gap between classical machine learning and quantum machine learning. However, the lack of standardized, high-quality datasets and robust translation frameworks limits progress in this domain. We introduce Q-Bridge, an LLM-guided code translation framework that systematically converts CML implementations into executable QML variants. Our approach builds on a self-evolving pipeline that iteratively expands a verified seed codebase into a large-scale dataset, CML-2-QML, integrating verifiable and unverifiable code pairs. The Q-Bridge model is fine-tuned using supervised LoRA adaptation for scalable and memory-efficient training, achieving faithful and interpretable quantum code generation across diverse architectures. Empirical analysis confirms the feasibility of direct CML-to-QML translation and reveals consistent structural alignment between classical and quantum paradigms. Case studies further demonstrate that Q-Bridge maintains deterministic correctness while also enabling creative architectural exploration. This work establishes the first reproducible framework and dataset for LLM-driven quantum code translation, offering a foundation for scalable quantum AI development.
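To make the "structural alignment between classical and quantum paradigms" concrete, here is a minimal, hypothetical illustration (not code from the paper): a one-parameter classical neuron next to its structurally aligned single-qubit QML counterpart, which angle-encodes the input, applies a trainable RY rotation, and measures a Pauli-Z expectation. All names (`classical_neuron`, `quantum_neuron`) are invented for this sketch.

```python
import math

def classical_neuron(x: float, w: float) -> float:
    """Classical ML unit: weighted input through a bounded activation."""
    return math.tanh(w * x)

def _ry(theta: float, state: tuple) -> tuple:
    """Apply an RY(theta) rotation to a real-amplitude single-qubit state."""
    a, b = state
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return (c * a - s * b, s * a + c * b)

def quantum_neuron(x: float, w: float) -> float:
    """Structurally aligned QML unit: encode x as a rotation angle,
    apply a trainable RY(w) "weight" rotation, then read out <Z>."""
    state = (1.0, 0.0)      # |0>
    state = _ry(x, state)   # angle-encode the classical input
    state = _ry(w, state)   # trainable rotation plays the role of the weight
    a, b = state
    return a * a - b * b    # <Z> expectation, bounded in [-1, 1]
```

Both units map one input and one trainable parameter to a bounded scalar, which is the kind of structure-preserving correspondence a CML-to-QML translator can target.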
Problem

Research questions and friction points this paper is trying to address.

quantum machine learning · code translation · large language models · dataset · classical-to-quantum
Innovation

Methods, ideas, or system contributions that make the work stand out.

LLM-guided code translation · CML-to-QML conversion · LoRA fine-tuning · quantum machine learning · self-evolving dataset expansion
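The paper fine-tunes with LoRA for memory efficiency; the sketch below shows the core mechanism in NumPy (an illustration of LoRA in general, not the Q-Bridge training code). A frozen base weight `W` is augmented by a scaled low-rank update `(alpha / r) * B @ A`, so only `r * (d_in + d_out)` parameters are trained, and the zero-initialized `B` makes the adapted model identical to the base model at the start of training. All dimensions and names here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 64, 64, 4, 8

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero init

def lora_forward(x: np.ndarray) -> np.ndarray:
    """Base path plus scaled low-rank path; equals the base model at init."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
assert np.allclose(lora_forward(x), W @ x)  # B == 0, so no change yet

full_params = d_in * d_out                  # 4096 if fine-tuning W directly
lora_params = r * (d_in + d_out)            # 512 trainable LoRA parameters
```

During training only `A` and `B` receive gradients, which is what makes LoRA-style supervised adaptation scalable and memory-efficient for an LLM-sized model.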