Domain-Aware Tensor Network Structure Search

📅 2025-05-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing tensor network architecture search (TN-AS) methods suffer from high computational overhead, reliance on numerous function evaluations, neglect of domain-specific priors, and poor structural interpretability. Method: We propose tnLLM—a novel framework that tightly integrates domain knowledge with large language model (LLM) reasoning. It employs a domain-aware prompting mechanism to directly generate physically interpretable TN architectures aligned with real-world modality relationships; performs structured architecture selection and attribution analysis within a constrained search space; and provides theoretically grounded, high-quality initializations for sampling-based state-of-the-art (SOTA) optimizers. Contribution/Results: Experiments demonstrate that tnLLM reduces function evaluations by several orders of magnitude versus SOTA while preserving optimization performance. Moreover, it significantly enhances architectural transparency and initialization quality, establishing a new paradigm for knowledge-infused, interpretable TN search.

📝 Abstract
Tensor networks (TNs) provide efficient representations of high-dimensional data, yet identifying the optimal TN structure, the so-called tensor network structure search (TN-SS) problem, remains a challenge. Current state-of-the-art (SOTA) algorithms are computationally expensive as they require extensive function evaluations, which is prohibitive for real-world applications. In addition, existing methods ignore valuable domain information inherent in real-world tensor data and lack transparency in their identified TN structures. To this end, we propose a novel TN-SS framework, termed tnLLM, which incorporates domain information about the data and harnesses the reasoning capabilities of large language models (LLMs) to directly predict suitable TN structures. The proposed framework involves a domain-aware prompting pipeline which instructs the LLM to infer suitable TN structures based on the real-world relationships between tensor modes. In this way, our approach is capable of not only iteratively optimizing the objective function, but also generating domain-aware explanations for the identified structures. Experimental results demonstrate that tnLLM achieves comparable TN-SS objective function values with much fewer function evaluations compared to SOTA algorithms. Furthermore, we demonstrate that the LLM-enabled domain information can be used to find good initializations in the search space for sampling-based SOTA methods to accelerate their convergence while preserving theoretical performance guarantees.
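The TN-SS objective that such algorithms evaluate repeatedly (and that tnLLM aims to call far fewer times) trades off approximation error against model size for a candidate structure. A minimal sketch, assuming a tensor-train (TT) format and a simple error-plus-compression objective; the function names and the `lam` weighting are illustrative, not the paper's:

```python
import numpy as np

def tt_svd(tensor, ranks):
    """Decompose a tensor into TT cores via sequential truncated SVDs."""
    shape = tensor.shape
    d = len(shape)
    cores, r_prev = [], 1
    mat = tensor.reshape(shape[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(ranks[k], len(S))          # truncate to the candidate rank
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        mat = (S[:r, None] * Vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into the full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))

def tnss_objective(tensor, ranks, lam=1.0):
    """One TN-SS function evaluation: relative error plus a size penalty."""
    cores = tt_svd(tensor, ranks)
    approx = tt_reconstruct(cores).reshape(tensor.shape)
    rel_err = np.linalg.norm(tensor - approx) / np.linalg.norm(tensor)
    compression = sum(c.size for c in cores) / tensor.size
    return rel_err + lam * compression
```

Each candidate rank tuple costs one such evaluation; sampling-based TN-SS methods need many of them, which is the cost tnLLM's domain-aware predictions are meant to avoid.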
Problem

Research questions and friction points this paper is trying to address.

Identifying optimal tensor network structures efficiently
Incorporating domain information into TN structure search
Reducing computational cost of tensor network optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses LLMs to predict tensor network structures
Incorporates domain-aware prompting for data relationships
Reduces function evaluations while maintaining performance
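The domain-aware prompting idea amounts to describing what each tensor mode means in the real world and asking the LLM to propose a structure consistent with those relationships. A rough illustration of assembling such a prompt; the wording and the helper below are hypothetical, not taken from the paper:

```python
def build_domain_prompt(mode_names, mode_descriptions):
    """Assemble a domain-aware TN-SS prompt (hypothetical wording)."""
    lines = [
        "You are selecting a tensor network structure for the data below.",
        "Tensor modes and their real-world meaning:",
    ]
    for name, desc in zip(mode_names, mode_descriptions):
        lines.append(f"- {name}: {desc}")
    lines.append(
        "Propose ranks for the edges between modes that reflect how strongly "
        "these modes are related, and explain each choice."
    )
    return "\n".join(lines)
```

The returned string would then be sent to the LLM, whose proposed structure can serve either as the final answer or as an initialization for a sampling-based optimizer.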
Giorgos Iacovides
Department of Electrical and Electronic Engineering, Imperial College London
Wuyang Zhou
Department of Electrical and Electronic Engineering, Imperial College London
Chao Li
RIKEN AIP
Qibin Zhao
RIKEN AIP
Machine Learning · Tensor Decomposition · Tensor Networks
Danilo Mandic
Prof. of Machine Intelligence, Dept of Electrical and Electronic Eng., Imperial College London, UK
Machine Intelligence and Statistical Signal Proc. · Biomedicine and Finance · Hearables and Ear-EEG · Deep RNNs · Tensors and Graphs