LOBSTUR: A Local Bootstrap Framework for Tuning Unsupervised Representations in Graph Neural Networks

📅 2025-05-20
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Unsupervised graph neural networks (GNNs) suffer from high sensitivity to hyperparameters and the absence of reliable, label-free model selection criteria. To address this, we propose a local bootstrap-based representation tuning framework: it generates perturbed graphs via edge and node feature resampling that preserves local graph structure, and employs canonical correlation analysis (CCA) to quantify embedding consistency across perturbations—enabling fully unsupervised model selection and hyperparameter optimization. This work is the first to systematically adapt bootstrapping to graph-structured learning, introducing a novel local-aware perturbation mechanism and an unsupervised embedding consistency evaluation paradigm, thereby eliminating reliance on downstream task validation or manual intervention. On standard benchmarks, our method improves classification accuracy by 65.9% over random search, and demonstrates strong generalizability and practical utility in real-world scenarios.
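The CCA-based consistency idea above can be sketched briefly. This is a minimal illustration, not the paper's implementation: given two embedding matrices produced by the same GNN on two bootstrap perturbations of a graph, the mean canonical correlation measures how consistent the learned representations are (the function name `cca_consistency` and the ridge term `eps` are illustrative choices, not from the paper).

```python
import numpy as np

def cca_consistency(z1, z2, eps=1e-8):
    """Mean canonical correlation between two (n x d) embedding matrices.

    Higher values mean the two embeddings agree up to a linear map,
    i.e. the model is stable under the graph perturbation.
    """
    # Center both embeddings.
    z1 = z1 - z1.mean(axis=0)
    z2 = z2 - z2.mean(axis=0)
    n = z1.shape[0]

    # Covariance blocks, with a small ridge for numerical stability.
    c11 = z1.T @ z1 / (n - 1) + eps * np.eye(z1.shape[1])
    c22 = z2.T @ z2 / (n - 1) + eps * np.eye(z2.shape[1])
    c12 = z1.T @ z2 / (n - 1)

    def inv_sqrt(c):
        # Inverse matrix square root via eigendecomposition (c is symmetric PSD).
        w, v = np.linalg.eigh(c)
        return v @ np.diag(1.0 / np.sqrt(np.clip(w, eps, None))) @ v.T

    # Singular values of the whitened cross-covariance are the canonical correlations.
    t = inv_sqrt(c11) @ c12 @ inv_sqrt(c22)
    s = np.linalg.svd(t, compute_uv=False)
    return float(np.clip(s, 0.0, 1.0).mean())
```

In a tuning loop, one would compute this score for each hyperparameter setting across several bootstrap pairs and keep the setting with the highest average consistency.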

📝 Abstract
Graph Neural Networks (GNNs) are increasingly used in conjunction with unsupervised learning techniques to learn powerful node representations, but their deployment is hindered by their high sensitivity to hyperparameter tuning and the absence of established methodologies for selecting optimal models. To address these challenges, we propose LOBSTUR-GNN (Local Bootstrap for Tuning Unsupervised Representations in GNNs), a novel framework designed to adapt bootstrapping techniques for unsupervised graph representation learning. LOBSTUR-GNN tackles two main challenges: (a) adapting the bootstrap edge and feature resampling process to account for local graph dependencies when creating alternative versions of the same graph, and (b) establishing robust metrics for evaluating learned representations without ground-truth labels. Using locally bootstrapped resampling and leveraging Canonical Correlation Analysis (CCA) to assess embedding consistency, LOBSTUR provides a principled approach for hyperparameter tuning in unsupervised GNNs. We validate the effectiveness and efficiency of our proposed method through extensive experiments on established academic datasets, showing a 65.9% improvement in classification accuracy compared to an uninformed selection of hyperparameters. Finally, we deploy our framework on a real-world application, thereby demonstrating its validity and practical utility in various settings. The code is available at github.com/sowonjeong/lobstur-graph-bootstrap.
Problem

Research questions and friction points this paper is trying to address.

Addresses hyperparameter sensitivity in unsupervised GNNs
Develops robust metrics for evaluating unsupervised graph representations
Improves classification accuracy via local bootstrap resampling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Local bootstrap for unsupervised GNN tuning
Edge and feature resampling with local dependencies
CCA-based embedding consistency evaluation
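The "edge and feature resampling with local dependencies" idea can be illustrated with a deliberately simplified sketch. This is not the paper's algorithm: as a stand-in for local-aware perturbation, each node's feature vector is resampled from its own 1-hop neighborhood, so the perturbed graph stays locally plausible (the function `local_feature_bootstrap` and its details are hypothetical).

```python
import numpy as np

def local_feature_bootstrap(adj, x, rng=None):
    """Resample each node's features from its 1-hop neighborhood (plus itself).

    adj: (n x n) dense 0/1 adjacency matrix.
    x:   (n x d) node feature matrix.
    Returns a perturbed copy of x that respects local graph structure,
    a simplified stand-in for LOBSTUR's local resampling.
    """
    rng = np.random.default_rng(rng)
    n = adj.shape[0]
    x_new = x.copy()
    for v in range(n):
        nbrs = np.flatnonzero(adj[v])
        pool = np.append(nbrs, v)  # draw from neighbors and the node itself
        x_new[v] = x[rng.choice(pool)]
    return x_new
```

Feeding such perturbed copies through the same model and scoring the resulting embeddings for consistency is the unsupervised selection signal the framework relies on.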