🤖 AI Summary
Existing quantum Transformers (QTs) rely on deep parameterized quantum circuits (PQCs), making them highly susceptible to quantum hardware noise, inefficient to train, and lacking in robustness. To address these limitations, we propose the Vectorized Quantum Transformer (VQT). First, we design learnable vectorized quantum blocks that enable efficient, nonlinear embedding of classical data into Hilbert space. Second, VQT supports quantum-accelerated approximation of masked attention matrices, eliminating gradient backpropagation through quantum circuits. Third, it adopts a sampling-efficient, gradient-free quantum simulation paradigm, substantially reducing classical computational overhead. VQT achieves high circuit-simulation fidelity on both IBM and IonQ quantum hardware and competitive performance on NLP benchmark tasks on IBM's Kingston QPU. Crucially, it demonstrates superior training efficiency and enhanced robustness against quantum noise compared to conventional QTs.
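To make the first two ideas concrete, below is a minimal NumPy sketch of a vectorized quantum block: a learnable affine map turns a classical input into rotation angles, and the amplitudes of the resulting product state serve as a nonlinear embedding in Hilbert space. The RY-rotation ansatz, the affine re-parameterization, and all function names here are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def ry_state(theta: float) -> np.ndarray:
    """Single-qubit state RY(theta)|0> = [cos(theta/2), sin(theta/2)]."""
    return np.array([np.cos(theta / 2.0), np.sin(theta / 2.0)])

def vectorized_quantum_block(x: np.ndarray, W: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Hypothetical vectorized quantum block: embed classical x into Hilbert space.

    A learnable affine map (W, b) produces one rotation angle per qubit; the
    tensor product of the rotated qubits is a 2^n-dimensional normalized state
    whose amplitudes act as nonlinear features of x.
    """
    thetas = W @ x + b                    # learnable re-parameterization
    state = np.array([1.0])
    for theta in thetas:                  # build the n-qubit product state
        state = np.kron(state, ry_state(theta))
    return state

# Toy usage: embed a 4-dim input into the 2^3 = 8-dim state of 3 qubits.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
W, b = rng.normal(size=(3, 4)), rng.normal(size=3)
phi = vectorized_quantum_block(x, W, b)
print(phi.shape, np.linalg.norm(phi))     # (8,) 1.0 -- a valid quantum state
```

Note that the learnable weights live in the classical angle map rather than deep inside a PQC, which is consistent with the gradient-free claim: no gradients need to flow through the quantum circuit itself.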
📝 Abstract
Vectorized quantum block encoding provides a way to embed classical data into Hilbert space, offering a pathway for quantum models such as Quantum Transformers (QTs), which replace classical self-attention with quantum circuit simulations, to operate more efficiently. Current QTs rely on deep parameterized quantum circuits (PQCs), rendering them vulnerable to quantum processing unit (QPU) noise and hindering their practical performance. In this paper, we propose the Vectorized Quantum Transformer (VQT), a model that supports computation of ideal masked attention matrices through approximate quantum simulation and efficient training via a vectorized nonlinear quantum encoder, yielding shot-efficient, gradient-free quantum circuit simulation (QCS) and reduced classical sampling overhead. In addition, we present an accuracy comparison between IBM and IonQ hardware for quantum circuit simulation, along with competitive results on benchmark natural language processing tasks run on IBM's state-of-the-art, high-fidelity Kingston QPU. Our noisy intermediate-scale quantum (NISQ)-friendly VQT approach unlocks a novel architecture for end-to-end machine learning in quantum computing.
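As a companion sketch (again a hypothetical reconstruction, not the paper's procedure), a masked attention matrix can be approximated from pairwise overlaps between embedded query and key states, with finite-shot binomial sampling standing in for hardware measurement; no gradient is propagated through the quantum step.

```python
import numpy as np

def shot_estimate(p: float, shots: int, rng) -> float:
    """Estimate an overlap probability p from a finite number of measurement
    shots, mimicking e.g. a swap or compute-uncompute test on hardware."""
    return rng.binomial(shots, p) / shots

def masked_attention(Q_states, K_states, shots=1024, seed=0):
    """Approximate a causally masked attention matrix from state overlaps.

    Q_states / K_states: lists of embedded query/key statevectors (e.g. from
    the vectorized quantum block sketched earlier). Each raw score is the
    fidelity |<q_i|k_j>|^2, estimated from `shots` samples, so the quantum
    step is sampling-based and gradient-free.
    """
    rng = np.random.default_rng(seed)
    n = len(Q_states)
    scores = np.full((n, n), -np.inf)       # -inf encodes the causal mask
    for i in range(n):
        for j in range(i + 1):              # causal mask: attend to j <= i only
            p = min(1.0, abs(np.vdot(Q_states[i], K_states[j])) ** 2)
            scores[i, j] = shot_estimate(p, shots, rng)
    # Row-wise softmax; masked entries (exp(-inf) = 0) get zero weight.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    return weights / weights.sum(axis=1, keepdims=True)

# Toy usage with random normalized 8-dim "states".
rng = np.random.default_rng(1)
states = [s / np.linalg.norm(s) for s in rng.normal(size=(4, 8))]
print(masked_attention(states, states).round(3))
```

Increasing `shots` trades classical sampling overhead for a lower-variance estimate of each attention entry, which is the knob the shot-efficiency claim concerns.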