Closer through commonality: Enhancing hypergraph contrastive learning with shared groups

📅 2024-12-15
🏛️ BigData Congress [Services Society]
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Modeling high-order correlations in hypergraphs remains challenging, and existing contrastive learning methods often disrupt the underlying topological structure. Method: This paper proposes HyFi, a fine-grained contrastive learning framework for hypergraphs. It (1) employs lightweight, topology-preserving augmentation via node feature perturbation; (2) introduces "weak positive pairs" to refine positive sample identification; and (3) explicitly models shared semantic representations among nodes co-occurring in the same hyperedge, the first such formulation in hypergraph contrastive learning. Contribution/Results: HyFi establishes the first fine-grained contrastive learning paradigm for hypergraphs, eliminating reliance on structural perturbations. Evaluated on node classification across 10 benchmark datasets, HyFi achieves a significantly lower average rank than both supervised and unsupervised baselines, while offering faster training and reduced GPU memory consumption. The code is publicly available.

📝 Abstract
Hypergraphs provide a superior modeling framework for representing complex multidimensional relationships in the context of real-world interactions that often occur in groups, overcoming the limitations of traditional homogeneous graphs. However, there have been few studies on hypergraph-based contrastive learning, and existing graph-based contrastive learning methods have not been able to fully exploit the high-order correlation information in hypergraphs. Here, we propose a Hypergraph Fine-grained contrastive learning (HyFi) method designed to exploit the complex high-dimensional information inherent in hypergraphs. While avoiding traditional graph augmentation methods that corrupt the hypergraph topology, the proposed method provides a simple and efficient augmentation function by adding noise to node features. Furthermore, we expand beyond the traditional dichotomous relationship between positive and negative samples in contrastive learning by introducing a new relationship of weak positives, demonstrating the importance of fine-grained positive sampling in contrastive learning. As a result, HyFi is able to produce high-quality embeddings, and it outperforms both supervised and unsupervised baselines in average rank on node classification across 10 datasets. Our approach effectively exploits high-dimensional hypergraph information, shows significant improvement over existing graph-based contrastive learning methods, and is efficient in terms of training speed and GPU memory cost. The source code is available at https://github.com/Noverse0/HyFi.git.
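The two core ideas in the abstract, topology-preserving augmentation by node-feature noise and weak positive pairs derived from shared hyperedges, can be sketched roughly as follows. This is an illustrative sketch only, not the authors' implementation: the Gaussian noise scheme, the `eps` scale, and the function names are assumptions.

```python
import numpy as np

def feature_noise_augment(X, eps=0.1, seed=None):
    """Create an augmented view by perturbing node features with Gaussian noise.

    The hypergraph incidence structure is left untouched, so the topology
    is preserved exactly (unlike node/edge-dropping augmentations).
    NOTE: Gaussian noise with scale `eps` is an illustrative assumption.
    """
    rng = np.random.default_rng(seed)
    return X + rng.normal(loc=0.0, scale=eps, size=X.shape)

def weak_positive_mask(H):
    """Mark node pairs that co-occur in at least one hyperedge as weak positives.

    H is a binary incidence matrix of shape (num_nodes, num_hyperedges).
    H @ H.T counts shared hyperedges per node pair; the diagonal
    (each node with itself) is excluded.
    """
    co_occurrence = H @ H.T
    return (co_occurrence > 0) & ~np.eye(H.shape[0], dtype=bool)

# Toy example: 4 nodes, 3-dim features, 2 hyperedges.
X = np.ones((4, 3))
X_view = feature_noise_augment(X, eps=0.05, seed=0)

H = np.array([[1, 0],
              [1, 0],
              [0, 1],
              [1, 1]])
weak_pos = weak_positive_mask(H)
# Nodes 0 and 1 share hyperedge 0, so they form a weak positive pair;
# nodes 0 and 2 share no hyperedge.
```

In a full contrastive objective, the weak-positive mask would soften the usual positive/negative dichotomy, e.g. by down-weighting these pairs relative to the anchor's own augmented view.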
Problem

Research questions and friction points this paper is trying to address.

Enhancing hypergraph contrastive learning
Exploiting high-order correlation information
Introducing weak positives in contrastive learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hypergraph Fine-grained contrastive learning
Noise addition for feature augmentation
Introduction of weak positives relation
Daeyoung Roh
Graduate School of Data Science, KAIST, Republic of Korea
Donghee Han
KAIST GSDS
Daehee Kim
NAVER Cloud
Keejun Han
School of Computer Engineering, Hansung University, Republic of Korea
M. Yi
Department of Industrial and Systems Engineering, KAIST, Republic of Korea