🤖 AI Summary
This work addresses the scalability bottleneck quantum machine learning faces when modeling complex multivariate time series. The authors propose a parameter-efficient hybrid architecture, the Hybrid Quantum Temporal Convolutional Network (HQTCN), which integrates classical temporal windowing with a quantum convolutional neural network core. By sharing a single quantum circuit across the time dimension, the model captures long-range dependencies while keeping the number of trainable parameters small and independent of sequence length. Experiments show the method performs competitively with classical baselines on univariate NARMA sequences and outperforms all baselines on high-dimensional multivariate EEG data, achieving strong accuracy with far fewer parameters and excelling particularly in data-scarce scenarios.
📝 Abstract
Quantum machine learning models for sequential data face scalability challenges with complex multivariate signals. We introduce the Hybrid Quantum Temporal Convolutional Network (HQTCN), which combines classical temporal windowing with a quantum convolutional neural network core. By applying a shared quantum circuit across temporal windows, HQTCN captures long-range dependencies while achieving significant parameter reduction. Evaluated on synthetic NARMA sequences and high-dimensional EEG time-series, HQTCN performs competitively with classical baselines on univariate data and outperforms all baselines on multivariate tasks. The model demonstrates particular strength under data-limited conditions, maintaining high performance with substantially fewer parameters than conventional approaches. These results establish HQTCN as a parameter-efficient approach for multivariate time-series analysis.
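The weight-sharing idea at the heart of the architecture can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the circuit choices below (angle encoding of each window, one shared trainable RY layer, a CNOT entangler, a Pauli-Z readout) are illustrative assumptions. The point the sketch makes is structural: because the same quantum kernel is slid across every temporal window, the trainable parameter count stays fixed no matter how long the input sequence is.

```python
import numpy as np

def ry(theta):
    # Single-qubit RY rotation matrix (real-valued).
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT entangler on two qubits (control = qubit 0).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def quantum_kernel(window, params):
    # Angle-encode a 2-sample window into two qubits, apply the
    # SHARED trainable RY layer plus a CNOT, and read out the
    # Pauli-Z expectation of qubit 0 as the kernel output.
    state = np.kron(ry(window[0]) @ np.array([1.0, 0.0]),
                    ry(window[1]) @ np.array([1.0, 0.0]))
    state = np.kron(ry(params[0]), ry(params[1])) @ state
    state = CNOT @ state
    probs = state ** 2
    # Basis order |00>,|01>,|10>,|11>: qubit 0 is 0 on the first
    # two entries (+1 eigenvalue) and 1 on the last two (-1).
    return probs[0] + probs[1] - probs[2] - probs[3]

def hqtcn_layer(series, params, window=2, stride=1):
    # Slide one shared quantum kernel over the sequence: the
    # parameter count (len(params)) is independent of len(series).
    return np.array([quantum_kernel(series[i:i + window], params)
                     for i in range(0, len(series) - window + 1, stride)])

series = np.sin(np.linspace(0, 2 * np.pi, 16))
params = np.array([0.3, -0.7])  # shared trainable parameters
out = hqtcn_layer(series, params)
print(out.shape)  # one expectation value per temporal window
```

Doubling the sequence length doubles the number of windows processed, but `params` is unchanged; this is the mechanism behind the parameter savings the abstract describes. A trainable version would backpropagate through the circuit simulation or use parameter-shift gradients on hardware.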