Network Embedding With Completely-Imbalanced Labels

📅 2020-07-07
🏛️ IEEE Transactions on Knowledge and Data Engineering
📈 Citations: 55
Influential: 4
🤖 AI Summary
To address the challenge of completely imbalanced class labels in network embedding—where certain classes have no labeled nodes—this paper proposes two semi-supervised approaches: the shallow model RSDNE and the graph neural network RECT. RSDNE preserves intra-class compactness and inter-class separability via adaptive similarity constraints; RECT is the first to incorporate class semantic knowledge, jointly modeling node features and multi-label information to enhance discriminability for zero-shot classes. This work constitutes the first systematic study of network embedding under fully imbalanced label settings. Extensive experiments on multiple real-world datasets demonstrate that both methods significantly outperform existing semi-supervised embedding baselines, particularly achieving effective knowledge transfer to unlabeled (zero-shot) classes. Substantial improvements are observed in downstream node classification and clustering tasks.
📝 Abstract
Network embedding, aiming to project a network into a low-dimensional space, is increasingly becoming a focus of network research. Semi-supervised network embedding takes advantage of labeled data, and has shown promising performance. However, existing semi-supervised methods would get unappealing results in the completely-imbalanced label setting where some classes have no labeled nodes at all. To alleviate this, we propose two novel semi-supervised network embedding methods. The first one is a shallow method named RSDNE. Specifically, to benefit from the completely-imbalanced labels, RSDNE guarantees both intra-class similarity and inter-class dissimilarity in an approximate way. The other method is RECT which is a new class of graph neural networks. Different from RSDNE, to benefit from the completely-imbalanced labels, RECT explores the class-semantic knowledge. This enables RECT to handle networks with node features and multi-label setting. Experimental results on several real-world datasets demonstrate the superiority of the proposed methods.
Problem

Research questions and friction points this paper is trying to address.

Addresses network embedding with completely-imbalanced labels
Improves performance when some classes lack labeled nodes
Proposes methods for intra-class similarity and inter-class dissimilarity
Innovation

Methods, ideas, or system contributions that make the work stand out.

RSDNE ensures intra-class similarity and inter-class dissimilarity
RECT explores class-semantic knowledge for imbalanced labels
Both methods handle completely-imbalanced label settings effectively
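As a rough sketch of the two supervised signals these methods exploit, the following toy code (an illustrative simplification, not the paper's actual RSDNE objective; the function name and pairwise squared-distance form are assumptions) measures intra-class compactness and inter-class separation over node embeddings:

```python
import numpy as np

def supervised_terms(Z, labels):
    """Toy illustration of the two supervised signals used by
    RSDNE-style semi-supervised embedding: same-class embeddings
    should be close (small intra), different-class embeddings far
    apart (large inter). Hypothetical simplification of the paper's
    objective, for intuition only."""
    intra, inter = 0.0, 0.0
    n = len(labels)
    for i in range(n):
        for j in range(i + 1, n):
            d = float(np.sum((Z[i] - Z[j]) ** 2))  # squared Euclidean distance
            if labels[i] == labels[j]:
                intra += d  # intra-class: a good embedding keeps this small
            else:
                inter += d  # inter-class: a good embedding keeps this large
    return intra, inter
```

An embedding that separates labeled classes well yields a small `intra` relative to `inter`; RSDNE enforces such constraints only approximately, since under completely-imbalanced labels some classes contribute no pairs at all.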
Zheng Wang
Department of Computer Science and Technology, University of Science and Technology Beijing, Beijing, China
Xiaojun Ye
School of Software, Tsinghua University, Beijing, China
Chaokun Wang
Tsinghua University
Database · Multimedia · Social Networks
Jian Cui
University of Illinois Urbana-Champaign
Cybersecurity
Philip S. Yu
Professor of Computer Science, University of Illinois at Chicago
Data Mining · Database · Privacy