Tensor Networks Meet Neural Networks: A Survey and Future Perspectives

📅 2023-01-22
📈 Citations: 15
Influential: 0
🤖 AI Summary
This work addresses theoretical and practical bottlenecks in integrating tensor networks (TNs) with neural networks (NNs). Methodologically, it proposes a unified "tensorial neural networks" (TNNs) framework, introducing the first comprehensive taxonomy of TNNs, encompassing matrix product state/tensor train (MPS/TT) decompositions, high-order tensor contractions, differentiable tensor layers, and quantum-classical hybrid modeling, to theoretically bridge NNs, TNs, and quantum circuits. Key contributions include: (1) releasing a curated open-source TNN resource repository (awesome-tensorial-neural-networks); (2) achieving >90% parameter reduction via efficient model compression; (3) enabling multi-source fusion, multimodal pooling, and differentiable quantum circuit simulation; and (4) identifying frontier research directions, including differentiable tensor operator design and hardware-aware co-optimization. The framework establishes a novel paradigm for lightweight AI models and quantum machine learning.
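To make the ">90% parameter reduction" claim concrete, the sketch below counts parameters when a dense layer's weight matrix is reshaped into a higher-order tensor and stored as TT cores. All shapes and the uniform TT-rank here are illustrative assumptions, not values from the paper.

```python
import numpy as np  # only used for consistency with the reconstruction sketch

# Hypothetical setup: a 1024x1024 dense layer, with 1024 = 4*4*8*8 on
# both the input and output side, stored in tensor-train (TT) format.
in_modes = (4, 4, 8, 8)    # factorization of the 1024 input features
out_modes = (4, 4, 8, 8)   # factorization of the 1024 output features
rank = 4                   # uniform internal TT-rank (assumption)

dense_params = 1024 * 1024

# Each TT core has shape (r_{k-1}, in_k, out_k, r_k); boundary ranks are 1.
ranks = [1, rank, rank, rank, 1]
tt_params = sum(ranks[k] * in_modes[k] * out_modes[k] * ranks[k + 1]
                for k in range(len(in_modes)))

reduction = 1 - tt_params / dense_params
print(f"dense: {dense_params}, TT: {tt_params}, reduction: {reduction:.1%}")
```

With these (assumed) mode sizes and ranks, the TT layer stores 1,600 parameters instead of 1,048,576, comfortably above the 90% reduction the summary cites; actual savings depend on the chosen factorization and ranks.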
📝 Abstract
Tensor networks (TNs) and neural networks (NNs) are two fundamental data modeling approaches. TNs were introduced to solve the curse of dimensionality in large-scale tensors by converting an exponential number of dimensions to polynomial complexity. As a result, they have attracted significant attention in the fields of quantum physics and machine learning. Meanwhile, NNs have displayed exceptional performance in various applications, e.g., computer vision, natural language processing, and robotics research. Interestingly, although these two types of networks originate from different observations, they are inherently linked through the common multilinearity structure underlying both TNs and NNs, thereby motivating a significant number of intellectual developments regarding combinations of TNs and NNs. In this paper, we refer to these combinations as tensorial neural networks (TNNs), and present an introduction to TNNs in three primary aspects: network compression, information fusion, and quantum circuit simulation. Furthermore, this survey also explores methods for improving TNNs, examines flexible toolboxes for implementing TNNs, and documents TNN development while highlighting potential future directions. To the best of our knowledge, this is the first comprehensive survey that bridges the connections among NNs, TNs, and quantum circuits. We provide a curated list of TNNs at https://github.com/tnbar/awesome-tensorial-neural-networks.
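The abstract's point about exponential-to-polynomial storage can be illustrated with a minimal MPS/TT reconstruction: an order-d tensor with mode size n needs n**d entries in full, but only about d * n * r**2 entries as TT cores. The sizes and ranks below are assumptions chosen for illustration, not values from the survey.

```python
import numpy as np

d, n, r = 6, 4, 3            # order, mode size, TT-rank (all assumed)
rng = np.random.default_rng(0)

# TT cores G_k of shape (r_{k-1}, n, r_k), with boundary ranks equal to 1.
ranks = [1] + [r] * (d - 1) + [1]
cores = [rng.standard_normal((ranks[k], n, ranks[k + 1])) for k in range(d)]

# Contract the cores left to right (last axis against first axis)
# to recover the full order-d tensor.
full = cores[0]
for core in cores[1:]:
    full = np.tensordot(full, core, axes=1)
full = full.reshape((n,) * d)  # drop the size-1 boundary axes

full_params = n ** d                      # exponential in d
tt_params = sum(c.size for c in cores)    # linear in d
print(f"full tensor entries: {full_params}, TT parameters: {tt_params}")
```

Here the full tensor has 4,096 entries while the TT representation stores only 168 numbers; the gap widens rapidly as the order d grows, which is the "curse of dimensionality" argument in the abstract.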
Problem

Research questions and friction points this paper is trying to address.

Explores combining Tensor Networks and Neural Networks for advanced data modeling.
Investigates TNNs' applications in multi-source fusion and quantum data processing.
Surveys integration of TNNs with various neural network architectures and future directions.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines Tensor Networks with Neural Networks
Explores multi-source fusion and data compression
Integrates with various neural network architectures
Maolin Wang
City University of Hong Kong, HKSAR, China
Y. Pan
Harbin Institute of Technology Shenzhen, Shenzhen, China
Zenglin Xu
Fudan University
Machine Learning, Trustworthy AI, Federated Learning, Large Language Models, Time Series Analysis
Xiangli Yang
University of Electronic Science and Technology of China, Chengdu, China
Guangxi Li
Assistant Researcher, Quantum Science Center of G-H-M Greater Bay Area
quantum machine learning, quantum neural network
A. Cichocki
Systems Research Institute, Polish Academy of Sciences, Newelska 6, 01-447 Warsaw, Poland, and also with Artificial Intelligence Project, Riken, 103-0027 Tokyo, Japan