🤖 AI Summary
Quantum Approximate Optimization Algorithm (QAOA) circuits for Quadratic Unconstrained Binary Optimization (QUBO) problems—e.g., MaxCut—typically rely on gradient-based iterative parameter optimization, which incurs high classical computational cost and generalizes poorly across problem instances.
Method: We propose the first end-to-end quantum circuit synthesis framework for QAOA based on a generative pre-trained Transformer (GPT). It encodes graph structures and QUBO instances as sequences and directly synthesizes high-fidelity, adaptive-depth QAOA circuits without iterative optimization.
Contribution/Results: We introduce a novel, QAOA-specific synthetic dataset enabling cross-instance generalization and compact circuit design. Experiments demonstrate significant improvements in circuit generation efficiency on unseen graphs, completely eliminating gradient evaluations and multi-round parameter tuning. This reduces classical overhead and enhances QAOA’s scalability and practical applicability.
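As background for the QUBO framing above: MaxCut on a graph maps to a QUBO by setting the diagonal of the cost matrix to the negated vertex degrees and each edge's off-diagonal entry to 2, so that minimizing x^T Q x over binary x maximizes the cut. A minimal sketch (not code from the paper; names are illustrative), verified by brute force on a toy graph:

```python
from itertools import product

def maxcut_qubo(n, edges):
    """Upper-triangular QUBO matrix Q with cut(x) = -x^T Q x for binary x."""
    Q = [[0] * n for _ in range(n)]
    for i, j in edges:
        Q[i][i] -= 1          # diagonal accumulates -deg(i); valid since x_i^2 = x_i
        Q[j][j] -= 1
        Q[min(i, j)][max(i, j)] += 2
    return Q

def qubo_value(Q, x):
    n = len(Q)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def cut_size(edges, x):
    return sum(1 for i, j in edges if x[i] != x[j])

# Toy instance: a triangle; its maximum cut has size 2 (an odd cycle cannot be fully cut).
edges = [(0, 1), (1, 2), (0, 2)]
Q = maxcut_qubo(3, edges)
best = min(product([0, 1], repeat=3), key=lambda x: qubo_value(Q, x))
print(cut_size(edges, best))  # -> 2
```

The brute-force minimization is only for checking the encoding on tiny graphs; QAOA targets instances where this enumeration is infeasible.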
📝 Abstract
Quantum computing has the potential to improve our ability to solve certain optimization problems that are computationally difficult for classical computers, by offering new algorithmic approaches that may provide speedups under specific conditions. In this work, we introduce QAOA-GPT, a generative framework that leverages Generative Pretrained Transformers (GPT) to directly synthesize quantum circuits for solving quadratic unconstrained binary optimization problems, and we demonstrate it on the MaxCut problem on graphs. To diversify the training circuits and ensure their quality, we generated a synthetic dataset using the adaptive QAOA approach, a method that incrementally builds and optimizes problem-specific circuits. Experiments conducted on a curated set of graph instances demonstrate that QAOA-GPT generates high-quality quantum circuits for problem instances unseen during training and successfully parametrizes QAOA. Our results show that generating quantum circuits with QAOA-GPT significantly decreases the computational overhead of both classical QAOA and adaptive approaches, which often rely on gradient evaluations to construct the circuit and on classical optimization of its parameters. Our work shows that generative AI is a promising avenue for producing compact quantum circuits in a scalable way.
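To make concrete what the synthesized circuits compute, a depth-1 (p=1) QAOA state for MaxCut alternates a diagonal cost unitary with a single-qubit X-mixer and is then measured against the cut objective. The sketch below (a pure-Python statevector simulation, not the paper's code; function names are illustrative) evaluates the expected cut size, using the known fact that p=1 QAOA is exact on a single edge at γ = π/2, β = π/8:

```python
import cmath
import math

def qaoa_maxcut_expectation(n, edges, gamma, beta):
    """Expected cut size of a p=1 QAOA state, simulated as a dense
    statevector in pure Python (practical only for small n)."""
    dim = 1 << n
    # Cut size of every computational-basis state z.
    cut = [sum(1 for i, j in edges if (z >> i) & 1 != (z >> j) & 1)
           for z in range(dim)]
    # Start in |+...+> and apply the diagonal cost unitary e^{-i*gamma*C}.
    amp = 1 / math.sqrt(dim)
    psi = [amp * cmath.exp(-1j * gamma * cut[z]) for z in range(dim)]
    # Mixer e^{-i*beta*X} on each qubit: [[cos b, -i sin b], [-i sin b, cos b]].
    c, s = math.cos(beta), -1j * math.sin(beta)
    for q in range(n):
        bit = 1 << q
        for z in range(dim):
            if not z & bit:  # pair (z, z | bit) differs only in qubit q
                a0, a1 = psi[z], psi[z | bit]
                psi[z], psi[z | bit] = c * a0 + s * a1, s * a0 + c * a1
    return sum(abs(psi[z]) ** 2 * cut[z] for z in range(dim))

# Single edge: the optimal angles recover the maximum cut of 1 exactly.
val = qaoa_maxcut_expectation(2, [(0, 1)], math.pi / 2, math.pi / 8)
print(round(val, 6))  # -> 1.0
```

QAOA-GPT's role, per the abstract, is to emit both the circuit structure and angles such as γ and β directly, rather than finding them through the iterative classical optimization this toy example sidesteps by using known-optimal values.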