Are Large Language Models Good Temporal Graph Learners?

📅 2025-06-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Large language models (LLMs) remain underexplored for real-world temporal graph learning, particularly for link prediction. Method: This paper proposes Temporal Graph Talker (TGTalker), a framework for applying LLMs to link prediction on real-world temporal networks. It encodes graph structure and temporal neighborhood information into natural language prompts, exploits the recency bias of temporal graphs to select relevant context, and performs end-to-end LLM inference via prompt engineering. Contributions/Results: (1) It bridges LLMs and practical temporal graph learning; (2) it yields interpretable, text-based predictions; and (3) across five real-world networks it performs competitively with state-of-the-art temporal graph methods while consistently outperforming popular models such as TGN and HTGN.
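The core idea of converting a node's recent temporal neighborhood into a natural-language prompt can be sketched as follows. This is an illustrative reconstruction, not the paper's actual implementation; the function name, prompt wording, and the `k`-most-recent selection rule are assumptions based on the summary's description of recency bias and temporal neighbors.

```python
def build_link_prediction_prompt(edges, src, query_time, k=5):
    """Build a natural-language prompt for temporal link prediction.

    `edges` is a list of (u, v, t) interactions sorted by time.
    Following the recency bias described above, only the k most
    recent interactions involving `src` before `query_time` are kept.
    (Illustrative sketch; not TGTalker's exact prompt format.)
    """
    history = [(u, v, t) for (u, v, t) in edges if t < query_time]
    # Recency bias: keep the k most recent interactions touching src.
    recent = [e for e in history if src in (e[0], e[1])][-k:]
    lines = [f"At time {t}, node {u} interacted with node {v}."
             for (u, v, t) in recent]
    lines.append(
        f"Question: at time {query_time}, which node is node {src} "
        "most likely to interact with next? Explain your reasoning."
    )
    return "\n".join(lines)
```

The resulting string would be passed to an LLM, whose free-text answer doubles as the textual explanation the paper highlights.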

📝 Abstract
Large Language Models (LLMs) have recently driven significant advancements in Natural Language Processing and various other applications. While a broad range of literature has explored the graph-reasoning capabilities of LLMs, including their use as predictors on graphs, the application of LLMs to dynamic graphs -- real-world evolving networks -- remains relatively unexplored. Recent work studies synthetic temporal graphs generated by random graph models, but applying LLMs to real-world temporal graphs remains an open question. To address this gap, we introduce Temporal Graph Talker (TGTalker), a novel temporal graph learning framework designed for LLMs. TGTalker utilizes the recency bias in temporal graphs to extract relevant structural information, converted to natural language for LLMs, while leveraging temporal neighbors as additional information for prediction. TGTalker demonstrates competitive link prediction capabilities compared to existing Temporal Graph Neural Network (TGNN) models. Across five real-world networks, TGTalker performs competitively with state-of-the-art temporal graph methods while consistently outperforming popular models such as TGN and HTGN. Furthermore, TGTalker generates textual explanations for each prediction, thus opening up exciting new directions in explainability and interpretability for temporal link prediction. The code is publicly available at https://github.com/shenyangHuang/TGTalker.
Problem

Research questions and friction points this paper is trying to address.

Exploring LLMs' capability on real-world dynamic graphs
Developing a framework for temporal graph learning with LLMs
Enhancing explainability in temporal link prediction tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

TGTalker encodes temporal graph structure as natural language
Leverages temporal neighbors for enhanced prediction
Generates textual explanations for each prediction