Tensor Convolutional Network for Higher-Order Interaction Prediction in Sparse Tensors

📅 2025-03-14
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing tensor factorization (TF) methods struggle with high-order interaction prediction (e.g., top-k hyperedge prediction) in sparse tensors due to severe sparsity, leading to insufficient training of latent entity vectors and limited representational capacity. Method: We propose the Relation-Aware Tensor Convolutional Network (TCN), which first derives a multi-relational graph structure from the sparse tensor and then employs a relation-aware graph convolutional encoder to learn expressive entity representations—effectively decoupling and synergizing tensor factorization with graph neural networks (GNNs). TCN supports plug-and-play integration with mainstream TF or GNN backbones. Contribution/Results: TCN achieves strong interpretability and generalizability while significantly outperforming state-of-the-art TF and hyperedge prediction methods on multiple sparse tensor benchmarks, improving top-k prediction accuracy by 12.6%–28.3%.

Technology Category

Application Category

📝 Abstract
Many real-world data, such as recommendation data and temporal graphs, can be represented as incomplete sparse tensors where most entries are unobserved. For such sparse tensors, identifying the top-k higher-order interactions that are most likely to occur among unobserved ones is crucial. Tensor factorization (TF) has gained significant attention in various tensor-based applications, serving as an effective method for finding these top-k potential interactions. However, existing TF methods primarily focus on effectively fusing latent vectors of entities, which limits their expressiveness. Since most entities in sparse tensors have only a few interactions, their latent representations are often insufficiently trained. In this paper, we propose TCN, an accurate and compatible tensor convolutional network that integrates seamlessly with existing TF methods for predicting higher-order interactions. We design a highly effective encoder to generate expressive latent vectors of entities. To achieve this, we propose to (1) construct a graph structure derived from a sparse tensor and (2) develop a relation-aware encoder, TCN, that learns latent representations of entities by leveraging the graph structure. Since TCN complements traditional TF methods, we seamlessly integrate TCN with existing TF methods, enhancing the performance of predicting top-k interactions. Extensive experiments show that TCN integrated with a TF method outperforms competitors, including TF methods and a hyperedge prediction method. Moreover, TCN is broadly compatible with various TF methods and GNNs (Graph Neural Networks), making it a versatile solution.
Problem

Research questions and friction points this paper is trying to address.

Predicts top-k higher-order interactions in sparse tensors.
Enhances tensor factorization with a tensor convolutional network.
Generates expressive latent vectors using a graph-based encoder.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph structure derived from sparse tensors
Relation-aware encoder for latent representations
Integration with Tensor Factorization methods
🔎 Similar Papers
No similar papers found.