AI Summary
Quantum-inspired models in classical machine learning suffer from a functional limitation: existing approaches cannot jointly address supervised classification and generative modeling within a single model.
Method: We propose a unified supervised-generative framework based on matrix product states (MPS), featuring a dual-objective adversarial training scheme that simultaneously optimizes classification accuracy and sample fidelity. To overcome the strict normalization constraint inherent in conventional MPS representations, we introduce a non-normalized MPS sampling strategy and multiple learnable embedding functions.
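To make the dual classifier/generator idea concrete, here is a minimal sketch (not the authors' implementation) of an MPS whose centre tensor carries an extra label leg, so a single tensor train produces class scores from an embedded input. All dimensions, names, and the cosine/sine embedding are illustrative assumptions.

```python
import numpy as np

# Illustrative toy sketch, not the paper's implementation: an MPS in which
# the centre tensor carries an extra label leg, so one contraction of the
# tensor train with an embedded input yields per-class scores.
N, d, D, L = 8, 2, 4, 3        # sites, physical dim, bond dim, classes
rng = np.random.default_rng(0)

centre = N // 2
tensors = []
for i in range(N):
    Dl = 1 if i == 0 else D              # boundary bonds have dimension 1
    Dr = 1 if i == N - 1 else D
    shape = (Dl, d, L, Dr) if i == centre else (Dl, d, Dr)
    tensors.append(rng.normal(scale=0.5, size=shape))

def embed(x):
    """Map a scalar in [0, 1] to a d=2 feature vector (a common cos/sin embedding)."""
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def mps_logits(sample):
    """Contract an embedded sample through the MPS; returns L class scores."""
    left = np.ones(1)                    # running left boundary vector
    label = None
    for i, (x, A) in enumerate(zip(sample, tensors)):
        phi = embed(x)
        if i == centre:
            # contract the physical leg, keep the label leg open: -> (L, Dr)
            label = np.einsum('a,d,adlb->lb', left, phi, A)
        elif label is None:
            left = np.einsum('a,d,adb->b', left, phi, A)
        else:
            label = np.einsum('lb,d,bdc->lc', label, phi, A)
    return label[:, 0]                   # right boundary bond has dim 1

scores = mps_logits(rng.random(N))
print(scores.shape)
```

In training, these scores would feed a classification loss, while the same tensor train defines an unnormalized distribution over inputs for the generative objective.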
Contribution/Results: This work presents the first classical implementation of MPS for joint supervised classification and data generation. Empirical evaluation across multiple benchmark datasets demonstrates substantial improvements in both classification accuracy and generative quality, while confirming the model's robustness, generalization capability, and effectiveness under diverse settings.
Abstract
Quantum machine learning (QML) is a rapidly expanding field that merges the principles of quantum computing with the techniques of machine learning. One of the powerful mathematical frameworks in this domain is tensor networks. These networks approximate high-order tensors through contractions of lower-rank tensors. Initially developed for simulating quantum systems, tensor networks have become integral to quantum computing and, by extension, to QML. Drawing inspiration from these quantum methods, specifically matrix product states (MPS), we apply them in a classical machine learning setting. Their ability to efficiently represent and manipulate complex, high-dimensional data makes them effective in a supervised learning framework. Here, we present an MPS model that functions as both a classifier and a generator. The dual functionality of this novel MPS model permits a strategy that enhances the traditional training of supervised MPS models. This framework is inspired by generative adversarial networks and is geared towards generating more realistic samples by reducing outliers. In addition, our contributions offer insights into the mechanics of tensor network methods for generation tasks. Specifically, we discuss alternative embedding functions and a new sampling method from non-normalized MPSs.
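To ground the sampling discussion: for a real-valued MPS with discrete physical legs, the Born distribution |ψ(s)|²/Z can be sampled site by site by renormalizing each conditional marginal on the fly, so the MPS never needs a canonical (normalized) form. The sketch below is a generic illustration under that assumption, not the paper's specific strategy; all names and dimensions are made up.

```python
import numpy as np

# Hedged sketch, not the paper's exact method: sequential (ancestral)
# sampling from |psi(s)|^2 / Z of an unnormalized real MPS. The global
# norm Z is never computed; each site's conditional marginal is simply
# renormalized before drawing. Toy dimensions below.
N, d, D = 8, 2, 4
rng = np.random.default_rng(1)
tensors = [rng.normal(size=(1 if i == 0 else D, d, 1 if i == N - 1 else D))
           for i in range(N)]

def sample_config(tensors, rng):
    """Draw one configuration s = (s_1, ..., s_N) from |psi(s)|^2 / Z."""
    N = len(tensors)
    # right environments E[i]: sites i..N-1 contracted with their own copy
    E = [None] * (N + 1)
    E[N] = np.ones((1, 1))
    for i in range(N - 1, -1, -1):
        A = tensors[i]
        E[i] = np.einsum('adb,cdz,bz->ac', A, A, E[i + 1])
    left = np.ones((1, 1))          # left environment (bra x ket bonds)
    config = []
    for i in range(N):
        A = tensors[i]
        # unnormalized conditional weight of each outcome at site i
        w = np.einsum('ac,adb,cdz,bz->d', left, A, A, E[i + 1])
        p = np.maximum(w, 0.0)      # clip tiny negative round-off
        p /= p.sum()                # renormalize on the fly
        s = int(rng.choice(d, p=p))
        config.append(s)
        B = tensors[i][:, s, :]     # fix the sampled outcome
        left = np.einsum('ac,ab,cz->bz', left, B, B)
    return config

config = sample_config(tensors, rng)
print(config)                        # one list of N site outcomes
```

Each conditional weight is a trace of products of positive semidefinite environment matrices, so it is nonnegative up to round-off, which is why per-site renormalization suffices without any global normalization of the MPS.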