Using matrix-product states for time-series machine learning

📅 2024-12-20
🏛️ arXiv.org
📈 Citations: 1
✨ Influential: 0
🤖 AI Summary
This work addresses the challenge of modeling complex dependency structures in time series. The authors propose MPSTime, a method that uses matrix product states (MPS) to learn the joint probability distribution underlying an observed time-series dataset. MPSTime supports both classification and missing-value imputation, training for either task under a single logarithmic loss function, while explicitly encoding the full joint distribution learned from the data. Its low-rank tensor-train structure, with moderate maximum bond dimension (Ο‡_max = 20–160 in the reported applications), balances expressive power against computational cost. Evaluated on synthetic and real-world datasets spanning medicine, energy, and astronomy, MPSTime achieves performance competitive with state-of-the-art ML approaches, with the added advantage of an interpretable joint distribution. The implementation is publicly available.

πŸ“ Abstract
Matrix-product states (MPS) have proven to be a versatile ansatz for modeling quantum many-body physics. For many applications, and particularly in one dimension, they capture relevant quantum correlations in many-body wavefunctions while remaining tractable to store and manipulate on a classical computer. This has motivated researchers to also apply the MPS ansatz to machine learning (ML) problems where capturing complex correlations in datasets is also a key requirement. Here, we develop and apply an MPS-based algorithm, MPSTime, for learning a joint probability distribution underlying an observed time-series dataset, and show how it can be used to tackle important time-series ML problems, including classification and imputation. MPSTime can efficiently learn complicated time-series probability distributions directly from data, requires only moderate maximum MPS bond dimension $\chi_{\mathrm{max}}$, with values for our applications ranging between $\chi_{\mathrm{max}} = 20$ and $160$, and can be trained for both classification and imputation tasks under a single logarithmic loss function. Using synthetic and publicly available real-world datasets, spanning applications in medicine, energy, and astronomy, we demonstrate performance competitive with state-of-the-art ML approaches, but with the key advantage of encoding the full joint probability distribution learned from the data, which is useful for analyzing and interpreting its underlying structure. This manuscript is supplemented with the release of a publicly available code package, MPSTime, that implements our approach. The effectiveness of the MPS-based ansatz for capturing complex correlation structures in time-series data makes it a powerful foundation for tackling challenging time-series analysis problems across science, industry, and medicine.
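To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' MPSTime package) of how an MPS can assign a Born-rule probability to a time series: each time point is mapped through a feature map to a small physical vector, the vectors are contracted through the MPS site by site, and the squared amplitude gives an unnormalized probability. A log-likelihood loss would then sum the log of this quantity over the training data. All tensor shapes, the feature map, and the random initialization below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MPS with T sites, physical dimension d = 2, bond dimension chi.
# (A trained model would learn these tensors; here they are random.)
T, d, chi = 8, 2, 4
mps = [rng.normal(size=(1, d, chi))] \
    + [rng.normal(size=(chi, d, chi)) for _ in range(T - 2)] \
    + [rng.normal(size=(chi, d, 1))]

def feature(x):
    # Illustrative trigonometric feature map: a scalar in [0, 1]
    # becomes a d = 2 vector, as is common in MPS-based ML.
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def amplitude(series):
    # Contract the encoded time series with the MPS, site by site.
    v = np.ones((1,))
    for A, x in zip(mps, series):
        v = np.einsum('a,abc,b->c', v, A, feature(x))
    return v.item()

# Born rule: the (unnormalized) probability is the squared amplitude,
# non-negative by construction.
series = rng.uniform(size=T)
p = amplitude(series) ** 2
```

Classification fits naturally into this picture by attaching a label index to the MPS and reading off per-class amplitudes; imputation corresponds to conditioning the learned joint distribution on the observed time points.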
Problem

Research questions and friction points this paper is trying to address.

Develops MPSTime algorithm for time-series probability distribution learning
Applies MPS to classify and impute time-series data efficiently
Encodes full joint probability distribution for interpretable time-series analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

Matrix-product states for time-series machine learning
MPSTime algorithm learns joint probability distributions
Moderate bond dimension enables efficient training
Joshua B. Moore
School of Physics, The University of Sydney, NSW 2006, Australia
Hugo P. Stackhouse
School of Physics, The University of Sydney, NSW 2006, Australia
Ben D. Fulcher
The University of Sydney
time-series analysis, complex systems, sleep, computational neuroscience, neurophysics
S. Mahmoodian
Institute for Photonics and Optical Sciences (IPOS), School of Physics, The University of Sydney, NSW 2006, Australia; Centre for Engineered Quantum Systems, School of Physics, The University of Sydney, NSW, 2006, Australia