SA^2GFM: Enhancing Robust Graph Foundation Models with Structure-Aware Semantic Augmentation

📅 2025-11-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the insufficient robustness of Graph Foundation Models (GFMs) under domain noise, structural perturbations, and adversarial attacks, this paper proposes a Structure-Aware Semantic Enhancement Framework. The framework innovatively integrates hierarchical structural prior encoding, structure-guided information bottleneck compression, mixture-of-experts routing with null experts, and community-aware joint structural fine-tuning. It further introduces, for the first time, structure-aware textual prompt generation and self-supervised contrastive learning to strengthen cross-domain semantic alignment. Evaluated on node- and graph-classification tasks, the model consistently outperforms nine state-of-the-art methods. Under both random noise and adversarial perturbations, it demonstrates significantly enhanced robustness, achieving an average 5.3% improvement in cross-domain transfer accuracy.
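The structure-aware textual prompt generation mentioned above can be illustrated with a toy sketch: given a node's path in a hierarchical encoding tree (here a hand-written two-level community assignment), a prompt string summarizing its structural context is prepended to its text feature. The prompt template, function name, and inputs are all illustrative assumptions, not the paper's actual implementation.

```python
def structure_prompt(node_text, hierarchy, degree):
    """Format a node's hierarchical context as a textual prompt.

    hierarchy: community labels from coarse to fine, e.g. the node's
    path in an entropy-based encoding tree. The exact template is a
    made-up illustration, not taken from the paper.
    """
    path = " > ".join(hierarchy)
    return (f"[structure] community path: {path}; degree: {degree}. "
            f"[text] {node_text}")

# Example: a node whose text is an article title, sitting in the
# (hypothetical) "ML > graph learning" branch of the encoding tree.
prompt = structure_prompt("GNN survey paper", ["ML", "graph learning"], 7)
```

The augmented string can then be fed to any text encoder in place of the raw node text, which is one plausible way to inject structural priors at the feature level.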

📝 Abstract
Graph Foundation Models (GFMs) have made significant progress on a variety of tasks, but their robustness against domain noise, structural perturbations, and adversarial attacks remains underexplored. A key limitation is the insufficient modeling of hierarchical structural semantics, which are crucial for generalization. In this paper, we propose SA^2GFM, a robust GFM framework that improves domain-adaptive representations through Structure-Aware Semantic Augmentation. First, we encode hierarchical structural priors by transforming entropy-based encoding trees into structure-aware textual prompts for feature augmentation. The enhanced inputs are then processed by a self-supervised Information Bottleneck mechanism that distills robust, transferable representations via structure-guided compression. To address negative transfer in cross-domain adaptation, we introduce an expert adaptive routing mechanism that combines a mixture-of-experts architecture with a null-expert design. For efficient downstream adaptation, we propose a fine-tuning module that optimizes hierarchical structures through joint intra- and inter-community structure learning. Extensive experiments show that SA^2GFM outperforms nine state-of-the-art baselines in effectiveness and in robustness against random noise and adversarial perturbations on node and graph classification.
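The expert adaptive routing described in the abstract can be sketched as a mixture-of-experts gate with one extra "null" option that lets the router abstain from applying any expert, a simple way to avoid negative transfer on out-of-domain inputs. The class, top-1 routing, and identity behavior of the null expert below are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MoEWithNullExpert:
    """Minimal mixture-of-experts router with a null expert.

    The gate scores K real experts plus one null expert (index K);
    when the null expert wins, the input passes through unchanged.
    (Illustrative sketch, not the paper's code.)
    """

    def __init__(self, dim, num_experts, rng=None):
        rng = rng or np.random.default_rng(0)
        self.num_experts = num_experts
        # One linear expert per slot.
        self.experts = [rng.standard_normal((dim, dim)) / np.sqrt(dim)
                        for _ in range(num_experts)]
        # Gate scores num_experts + 1 options; the last is the null expert.
        self.gate = rng.standard_normal((dim, num_experts + 1)) / np.sqrt(dim)

    def __call__(self, x):
        # x: (batch, dim). Top-1 routing over real experts + null.
        probs = softmax(x @ self.gate)      # (batch, K+1)
        choice = probs.argmax(axis=-1)      # (batch,)
        out = np.empty_like(x)
        for i, k in enumerate(choice):
            if k == self.num_experts:       # null expert: identity
                out[i] = x[i]
            else:
                out[i] = x[i] @ self.experts[k]
        return out, choice
```

Real MoE layers usually mix the top-k experts with their gate weights and train the gate end-to-end; the hard top-1 choice here just makes the null-expert abstention easy to see.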
Problem

Research questions and friction points this paper is trying to address.

Enhancing Graph Foundation Models' robustness against domain noise and adversarial attacks
Addressing insufficient modeling of hierarchical structural semantics for generalization
Improving cross-domain adaptation by mitigating negative transfer effects
Innovation

Methods, ideas, or system contributions that make the work stand out.

Structure-aware textual prompts for feature augmentation
Self-supervised Information Bottleneck for robust representation distillation
Expert adaptive routing with mixture-of-experts and null expert
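The Information Bottleneck in the second bullet, in its common variational form, trades task-relevant information against compression of the input. A minimal numpy sketch of the compression term, the KL divergence of a diagonal-Gaussian posterior from a standard-normal prior, under standard variational-IB assumptions (this is the generic VIB penalty, not the paper's structure-guided variant):

```python
import numpy as np

def gaussian_kl(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over dims.

    Minimizing this squeezes the representation z toward the prior,
    discarding input-specific noise. (Generic VIB term, not taken
    from the paper.)
    """
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=-1)

def vib_loss(task_loss, mu, logvar, beta=1e-3):
    # Total objective: task fit + beta * compression penalty.
    return task_loss + beta * gaussian_kl(mu, logvar).mean()
```

Here `beta` controls how aggressively the bottleneck compresses; the paper's structure-guided version would additionally condition this trade-off on the hierarchical structure, which the sketch does not model.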