🤖 AI Summary
Modeling high-order correlations in hypergraphs remains challenging, and existing contrastive learning methods often disrupt the underlying topological structure. Method: This paper proposes HyFi, a fine-grained contrastive learning framework for hypergraphs. It (1) employs lightweight, topology-preserving augmentation via node feature perturbation; (2) introduces "weak positive pairs" to refine positive sample identification; and (3) explicitly models shared semantic representations among nodes co-occurring in the same hyperedge, which the authors present as the first such formulation in hypergraph contrastive learning. Contribution/Results: HyFi establishes a fine-grained contrastive learning paradigm for hypergraphs that eliminates reliance on structural perturbations. Evaluated on node classification across 10 benchmark datasets, HyFi achieves a lower average rank than both supervised and unsupervised baselines, while offering faster training and reduced GPU memory consumption. The code is publicly available.
📝 Abstract
Hypergraphs provide a superior modeling framework for representing the complex multidimensional relationships of real-world interactions, which often occur in groups, overcoming the limitations of traditional homogeneous graphs. However, there have been few studies on hypergraph-based contrastive learning, and existing graph-based contrastive learning methods cannot fully exploit the high-order correlation information in hypergraphs. Here, we propose a Hypergraph Fine-grained contrastive learning (HyFi) method designed to exploit the complex high-dimensional information inherent in hypergraphs. Avoiding traditional graph augmentation methods that corrupt the hypergraph topology, the proposed method provides a simple and efficient augmentation by adding noise to node features. Furthermore, we expand beyond the traditional dichotomy between positive and negative samples in contrastive learning by introducing a new relationship of weak positives, which demonstrates the importance of fine-grained positive sampling. As a result, HyFi produces high-quality embeddings and outperforms both supervised and unsupervised baselines in average rank on node classification across 10 datasets. Our approach effectively exploits high-dimensional hypergraph information, shows significant improvement over existing graph-based contrastive learning methods, and is efficient in terms of training speed and GPU memory cost. The source code is available at https://github.com/Noverse0/HyFi.git.
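The two core ideas in the abstract, topology-preserving augmentation by perturbing node features and weak positives defined by hyperedge co-membership, can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the function names, the Gaussian noise model, and the noise scale are all assumptions made here for clarity.

```python
# Illustrative sketch of HyFi's two ideas (assumed details, not the paper's code):
#   1) augment features with noise while leaving the incidence structure untouched
#   2) treat nodes that share at least one hyperedge as weak positives
import numpy as np

def augment(X, noise_scale=0.1, seed=0):
    """Topology-preserving augmentation: perturb node features only.

    The hypergraph incidence matrix is never modified, so the
    high-order structure stays intact across views.
    """
    rng = np.random.default_rng(seed)
    return X + noise_scale * rng.standard_normal(X.shape)

def weak_positive_mask(H):
    """Mark node pairs that co-occur in at least one hyperedge.

    H: (num_nodes, num_hyperedges) binary incidence matrix.
    Returns a symmetric 0/1 mask; the diagonal is zeroed because a
    node's own augmented view is its (strong) positive.
    """
    co = H @ H.T                   # pairwise hyperedge co-occurrence counts
    mask = (co > 0).astype(int)
    np.fill_diagonal(mask, 0)
    return mask

# Toy hypergraph: 4 nodes, 2 hyperedges {0,1,2} and {2,3}
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]])
X = np.eye(4)                      # dummy node features
X_aug = augment(X)                 # second view: same H, noisy features
print(weak_positive_mask(H))
```

In this toy example, nodes 0 and 1 become weak positives through the first hyperedge, while nodes 0 and 3 share no hyperedge and remain negatives; a contrastive loss would then weight these pairs between the strong-positive and negative extremes.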