GraphPFN: A Prior-Data Fitted Graph Foundation Model

📅 2025-09-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing graph foundation models rely heavily on hand-crafted features and struggle to capture complex structural patterns in graphs. To address this, the authors propose GraphPFN, a prior-data fitted network for node-level prediction on graph-structured data. The approach first defines a prior distribution over synthetic attributed graphs, generating graph structure with a combination of multiple stochastic block models and a preferential attachment process, then applies graph-aware structured causal models to generate node attributes and targets. Second, the tabular foundation model LimiX is augmented with attention-based graph neighborhood aggregation layers and pretrained on large-scale synthetic graphs sampled from this prior. The resulting model shows strong in-context learning on node-level prediction tasks and, after finetuning on real-world graphs with up to 50,000 nodes, achieves state-of-the-art results, outperforming both G2T-FM and task-specific GNNs trained from scratch on most datasets.

📝 Abstract
Foundation models pretrained on large-scale datasets have transformed such fields as natural language processing and computer vision, but their application to graph data remains limited. Recently emerged graph foundation models, such as G2T-FM, utilize tabular foundation models for graph tasks and were shown to significantly outperform prior attempts to create GFMs. However, these models primarily rely on hand-crafted graph features, limiting their ability to learn complex graph-specific patterns. In this work, we propose GraphPFN: a prior-data fitted network for node-level prediction. First, we design a prior distribution of synthetic attributed graphs. For graph structure generation, we use a novel combination of multiple stochastic block models and a preferential attachment process. We then apply graph-aware structured causal models to generate node attributes and targets. This procedure allows us to efficiently generate a wide range of realistic graph datasets. Then, we augment the tabular foundation model LimiX with attention-based graph neighborhood aggregation layers and train it on synthetic graphs sampled from our prior, allowing the model to capture graph structural dependencies not present in tabular data. On diverse real-world graph datasets with up to 50,000 nodes, GraphPFN shows strong in-context learning performance and achieves state-of-the-art results after finetuning, outperforming both G2T-FM and task-specific GNNs trained from scratch on most datasets. More broadly, our work demonstrates that pretraining on synthetic graphs from a well-designed prior distribution is an effective strategy for building graph foundation models.
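The prior over graph structures described in the abstract (multiple stochastic block models combined with a preferential attachment process) can be illustrated with a minimal sketch. This is not the paper's generator; it is a simplified illustration assuming a single SBM followed by a preferential-attachment pass, and all function and parameter names (`sample_prior_graph`, `p_in`, `p_out`, `pa_edges_per_node`) are hypothetical:

```python
import numpy as np

def sample_prior_graph(n_nodes=200, n_blocks=4, p_in=0.3, p_out=0.02,
                       pa_edges_per_node=2, rng=None):
    """Sample a synthetic graph mixing a stochastic block model with a
    preferential-attachment process. Simplified sketch: the paper combines
    multiple SBMs; one is used here for brevity.
    Returns a symmetric 0/1 adjacency matrix and block labels."""
    rng = np.random.default_rng(rng)

    # Stochastic block model: dense within blocks, sparse across blocks.
    blocks = rng.integers(0, n_blocks, size=n_nodes)
    same_block = blocks[:, None] == blocks[None, :]
    probs = np.where(same_block, p_in, p_out)
    upper = np.triu(rng.random((n_nodes, n_nodes)) < probs, k=1)
    adj = (upper | upper.T).astype(np.int8)

    # Preferential attachment: extra edges biased toward high-degree nodes,
    # producing the heavy-tailed degree distributions seen in real graphs.
    for v in range(n_nodes):
        degree = adj.sum(axis=1).astype(float)
        weights = degree + 1.0   # +1 so isolated nodes can still be chosen
        weights[v] = 0.0         # no self-loops
        weights /= weights.sum()
        targets = rng.choice(n_nodes, size=pa_edges_per_node,
                             replace=False, p=weights)
        adj[v, targets] = 1
        adj[targets, v] = 1
    np.fill_diagonal(adj, 0)
    return adj, blocks
```

Sampling many graphs with varied block counts, edge probabilities, and attachment rates is what gives the prior its diversity; node attributes and targets would then be generated on top of this structure by a structured causal model.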
Problem

Research questions and friction points this paper is trying to address.

Limited capabilities of existing graph foundation models
Reliance on hand-crafted graph features that miss complex graph-specific patterns
Capturing graph structural dependencies that are absent from tabular data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generates synthetic graph structure by combining multiple stochastic block models with a preferential attachment process
Augments the tabular foundation model LimiX with attention-based graph neighborhood aggregation layers
Pretrains the model on synthetic graphs sampled from a carefully designed prior distribution
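The second innovation, attention-based graph neighborhood aggregation, can be sketched as a single attention head in which each node attends only over its graph neighbors (plus itself) with softmax-normalized scores. This is a generic masked-attention illustration, not the exact GraphPFN layer; the projection matrices `wq`, `wk`, `wv` and the self-loop convention are assumptions:

```python
import numpy as np

def neighborhood_attention(h, adj, wq, wk, wv):
    """One attention head aggregating over graph neighbors (self included).

    h:  (n, d) node embeddings
    adj: (n, n) 0/1 adjacency matrix
    wq, wk, wv: (d, d_head) projection matrices
    Hypothetical sketch of attention restricted to the graph neighborhood."""
    n = h.shape[0]
    mask = (adj + np.eye(n)) > 0          # attend to neighbors and self
    q, k, v = h @ wq, h @ wk, h @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    scores = np.where(mask, scores, -np.inf)   # mask out non-neighbors
    scores -= scores.max(axis=-1, keepdims=True)
    w = np.exp(scores) * mask
    w = w / w.sum(axis=-1, keepdims=True)      # rows sum to 1
    return w @ v                               # weighted neighbor average
```

Because the softmax is masked to the neighborhood, an isolated node simply attends to itself, and a node's output is a convex combination of its neighbors' value projections, which is how such a layer injects graph structure into an otherwise tabular architecture.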
Authors
Dmitry Eremeev, HSE University, Yandex Research
Oleg Platonov, ML Researcher, Yandex Research
Gleb Bazhenov, HSE University, Yandex Research
Artem Babenko, Yandex
Liudmila Prokhorenkova, Yandex Research