🤖 AI Summary
Accurately predicting performance bounds of quantum computations under time-varying noise remains challenging due to the dynamic and heterogeneous nature of noise sources.
Method: This paper introduces QuBound, a novel framework that jointly predicts rigorous upper and lower performance bounds (rather than point estimates) by integrating performance decomposition with an LSTM-based encoder. It disentangles noise sources from historical performance trajectories and jointly encodes circuit topology and time-resolved noise characteristics to construct a temporally aware boundary prediction model.
Contribution/Results: QuBound achieves over a 10× reduction in prediction interval width compared to existing analytical methods. Relative to quantum circuit simulation and learning-based baselines, it accelerates inference by more than six orders of magnitude (10⁶×), while guaranteeing that all predicted bounds strictly contain the true performance values. This strict containment significantly enhances reliability for quantum task scheduling and system-level resource management.
📄 Abstract
Quantum computing has advanced significantly in recent years, boasting devices with hundreds of quantum bits (qubits) and hinting at a potential quantum advantage over classical computing. Yet, noise in quantum devices poses significant barriers to realizing this advantage. Understanding the impact of noise is crucial for reproducibility and application reuse; moreover, next-generation quantum-centric supercomputing essentially requires efficient and accurate noise characterization to support system management (e.g., job scheduling), where ensuring correct functional performance (i.e., fidelity) of jobs on available quantum devices can be even higher priority than traditional objectives. However, noise fluctuates over time, even on the same quantum device, which makes predicting computational bounds under on-the-fly noise vital. Noisy quantum simulation can offer insights but faces efficiency and scalability issues. In this work, we propose a data-driven workflow, namely QuBound, to predict computational performance bounds. It decomposes historical performance traces to isolate noise sources and devises a novel encoder to embed circuit and noise information, which is processed by a Long Short-Term Memory (LSTM) network. For evaluation, we compare QuBound with a state-of-the-art learning-based predictor, which generates only a single performance value instead of a bound. Experimental results show that the predictions of the existing approach fall outside the true performance bounds, whereas all predictions from QuBound, with the assistance of performance decomposition, fit the bounds well. Moreover, QuBound efficiently produces practical bounds for various circuits with over 10⁶× speedup over simulation; in addition, the range from QuBound is over 10× narrower than that of the state-of-the-art analytical approach.
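The workflow sketched in the abstract, decomposing a historical performance trace into a slowly varying baseline and a time-varying noise residual, then predicting an interval rather than a point, can be illustrated with a minimal toy sketch. The moving-average decomposition and the function names below are illustrative assumptions for exposition only, not QuBound's actual LSTM-based model:

```python
from statistics import mean

def decompose_trace(trace, window=5):
    """Split a historical fidelity trace into a slowly varying
    baseline (trailing moving average) and a noise residual."""
    baseline = [
        mean(trace[max(0, i - window + 1): i + 1])
        for i in range(len(trace))
    ]
    residual = [t - b for t, b in zip(trace, baseline)]
    return baseline, residual

def predict_bounds(trace, window=5):
    """Predict a fidelity interval for the next time step:
    last baseline value +/- the largest residual magnitude observed."""
    baseline, residual = decompose_trace(trace, window)
    spread = max(abs(r) for r in residual)
    center = baseline[-1]
    return center - spread, center + spread

# Example: a fidelity trace drifting around 0.95 under noise.
trace = [0.95, 0.94, 0.96, 0.93, 0.95, 0.94, 0.92, 0.95]
lo, hi = predict_bounds(trace)
assert lo <= trace[-1] <= hi  # the observed value lies inside the bound
```

In QuBound itself, the hand-coded moving average above is replaced by learned decomposition, and the interval is produced by an LSTM that jointly encodes circuit structure and time-resolved noise, which is what enables the reported tight, strictly containing bounds.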