QuaRK: A Quantum Reservoir Kernel for Time Series Learning

📅 2026-02-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing quantum reservoir computing approaches lack efficient architectures and theoretical guarantees, limiting their reliability in modeling time series. This work proposes QuaRK, an end-to-end framework that integrates a hardware-friendly quantum reservoir feature extractor with a kernel-based readout mechanism. It leverages classical shadow tomography to efficiently estimate k-local observables for feature construction and employs regularized kernel methods to learn nonlinear temporal mappings. QuaRK is the first to combine classical shadow tomography with kernel methods, providing generalization error bounds for β-mixing dependent time series and explicitly characterizing the impact of circuit width, depth, and measurement budget on performance. Experiments on synthetic β-mixing sequences demonstrate QuaRK’s interpolation and generalization capabilities, with theoretical bounds effectively guiding practical system design.
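The summary's key measurement primitive, estimating k-local Pauli observables from random-basis measurement snapshots (classical shadows), can be illustrated with a toy simulation. The sketch below is not the paper's implementation: it uses a hand-picked two-qubit product state (Bloch vectors are illustrative inputs) and the standard random-Pauli shadow estimator, where a snapshot contributes a product of 3·(outcome) over the observable's nontrivial qubits whenever the sampled basis matches, and 0 otherwise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy product state given by per-qubit Bloch vectors (rx, ry, rz).
# Qubit 0 is |0> (rz = +1), qubit 1 is |+> (rx = +1); these are
# illustrative choices, not anything from the paper.
bloch = np.array([[0.0, 0.0, 1.0],
                  [1.0, 0.0, 0.0]])
n_qubits = len(bloch)

def snapshot():
    """One random-Pauli-basis measurement round (a 'classical shadow')."""
    bases = rng.integers(0, 3, size=n_qubits)  # 0 = X, 1 = Y, 2 = Z
    # Probability of outcome +1 when measuring qubit q in basis b
    # is (1 + r_b) / 2 for a product state.
    probs_plus = (1.0 + bloch[np.arange(n_qubits), bases]) / 2.0
    outcomes = np.where(rng.random(n_qubits) < probs_plus, 1, -1)
    return bases, outcomes

def estimate_pauli(pauli, n_shots=20000):
    """Shadow estimator for a k-local Pauli string, e.g. {0: 2} for Z_0.

    A snapshot contributes prod_q 3 * s_q if the sampled basis matches
    the observable on every nontrivial qubit q, and 0 otherwise.
    """
    total = 0.0
    for _ in range(n_shots):
        bases, outcomes = snapshot()
        val = 1.0
        for q, p in pauli.items():
            val *= 3.0 * outcomes[q] if bases[q] == p else 0.0
        total += val
    return total / n_shots

print(estimate_pauli({0: 2}))        # estimates <Z_0> = 1
print(estimate_pauli({0: 2, 1: 0}))  # estimates <Z_0 X_1> = 1
```

The estimator's variance grows as 3^k with the locality k, which is why the measurement budget appears as an explicit resource knob in the paper's bounds.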

📝 Abstract
Quantum reservoir computing offers a promising route for time series learning: it models sequential data through rich quantum dynamics while confining training to a lightweight classical readout. However, efficient and implementable quantum reservoir architectures that come with learning guarantees remain scarce in the literature. To close this gap, we introduce QuaRK, an end-to-end framework that couples a hardware-realistic quantum reservoir featurizer with a kernel-based readout scheme. Given a sequence of sample points, the reservoir injects them one at a time and yields a compact feature vector built from k-local observables measured efficiently via classical shadow tomography; a classical kernel-based readout then learns the target mapping with explicit regularization and fast optimization. The resulting pipeline exposes clear computational knobs (circuit width, circuit depth, and measurement budget) while preserving the flexibility of kernel methods to model nonlinear temporal functionals and scaling to high-dimensional data. We further provide learning-theoretic generalization guarantees for dependent temporal data, linking design and resource choices to finite-sample performance and thereby offering principled guidance for building reliable temporal learners. Experiments validate QuaRK and illustrate the predicted interpolation and generalization behaviours on synthetic beta-mixing time series tasks.
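The abstract's pipeline (sequentially inject inputs into a fixed reservoir, read out a feature vector, then train only a regularized kernel readout) can be sketched end to end. The sketch below substitutes a small classical echo-state map for the quantum reservoir, purely to mimic the "inject points one at a time, extract features" structure; the task (predicting the mean of each sequence's last three inputs), the median-heuristic RBF bandwidth, and all dimensions are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Stand-in reservoir: a classical contraction map used here only to
# --- mimic sequential injection; QuaRK's reservoir is a quantum circuit
# --- whose k-local observables are estimated via classical shadows.
dim = 16
W = rng.normal(scale=0.5 / np.sqrt(dim), size=(dim, dim))  # contractive
w_in = rng.normal(size=dim)

def featurize(seq):
    """Inject the sample points one at a time; return the final state."""
    h = np.zeros(dim)
    for u in seq:
        h = np.tanh(W @ h + w_in * u)  # nonlinear state update
    return h

# Toy task: predict the mean of the last 3 inputs of each sequence.
def make_data(n, T=10):
    X = rng.uniform(-1, 1, size=(n, T))
    y = X[:, -3:].mean(axis=1)
    return np.stack([featurize(x) for x in X]), y

Phi_tr, y_tr = make_data(200)
Phi_te, y_te = make_data(50)

# --- Kernel ridge readout: RBF kernel, explicit regularization lam ---
def sq_dists(A, B):
    return ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)

d2_tr = sq_dists(Phi_tr, Phi_tr)
gamma = 1.0 / np.median(d2_tr[d2_tr > 0])  # median-heuristic bandwidth

lam = 1e-3
K = np.exp(-gamma * d2_tr)
alpha = np.linalg.solve(K + lam * np.eye(len(K)), y_tr)
pred = np.exp(-gamma * sq_dists(Phi_te, Phi_tr)) @ alpha
print("test MSE:", np.mean((pred - y_te) ** 2))
```

Only `alpha` is learned; the reservoir stays fixed, which is the division of labour the abstract describes, with the regularizer `lam` playing the role of the explicit regularization in the kernel readout.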
Problem

Research questions and friction points this paper is trying to address.

quantum reservoir computing
time series learning
learning guarantees
kernel methods
temporal data
Innovation

Methods, ideas, or system contributions that make the work stand out.

quantum reservoir computing
kernel methods
classical shadow tomography
time series learning
generalization guarantees