A Distillation-based Future-aware Graph Neural Network for Stock Trend Prediction

📅 2025-02-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Stock trend forecasting is hindered by conventional graph neural networks (GNNs), which model only historical spatiotemporal dependencies while neglecting the relationship between past and future patterns. To address this, we propose DishFT-GNN, a distillation framework that, for the first time, introduces future distribution awareness into GNN training. Using teacher–student collaborative knowledge distillation, it treats the distribution shift the teacher model identifies between historical and future patterns as an intermediate supervisory signal that guides the student toward future-aware spatiotemporal embeddings. The framework combines multi-step temporal graph construction with a two-stage iterative training paradigm to implicitly capture dynamic trend evolution. Evaluated on two real-world financial datasets, DishFT-GNN significantly improves accuracy and robustness in upward/downward trend prediction, achieving state-of-the-art performance.
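The combined objective implied by the summary can be sketched as a classification loss plus an alignment term toward the teacher's embedding. The paper does not publish its exact formulation, so the helper names, the MSE alignment term, and the weight `lam` below are illustrative assumptions, not the authors' code.

```python
import math

def cross_entropy(probs, label):
    # Negative log-likelihood of the true trend class (probs already softmaxed).
    return -math.log(probs[label] + 1e-12)

def mse(a, b):
    # Mean squared error between two embedding vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def student_loss(student_probs, label, student_emb, teacher_emb, lam=0.5):
    # Hypothetical combined objective: supervised up/down classification
    # plus alignment to the teacher's future-aware embedding, weighted by lam.
    return cross_entropy(student_probs, label) + lam * mse(student_emb, teacher_emb)
```

When the student embedding matches the teacher's, the alignment term vanishes and only the classification loss remains.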

📝 Abstract
Stock trend prediction involves forecasting future price movements by analyzing historical data and various market indicators. With the advancement of machine learning, graph neural networks (GNNs) have been extensively employed in stock prediction due to their powerful capability to capture spatiotemporal dependencies among stocks. However, despite the efforts of various GNN stock predictors to enhance predictive performance, the improvements remain limited, as they focus solely on analyzing historical spatiotemporal dependencies, overlooking the correlation between historical and future patterns. In this study, we propose a novel distillation-based future-aware GNN framework (DishFT-GNN) for stock trend prediction. Specifically, DishFT-GNN iteratively trains a teacher model and a student model. The teacher model learns to capture the correlation between distribution shifts of historical and future data, which is then used as intermediate supervision to guide the student model in learning future-aware spatiotemporal embeddings for accurate prediction. Through extensive experiments on two real-world datasets, we verify the state-of-the-art performance of DishFT-GNN.
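The iterative teacher–student schedule from the abstract can be outlined as two alternating update stages per batch. The `Stub` class and all method names (`fit_step`, `shift_embedding`) are hypothetical placeholders standing in for the paper's models; this is a sketch of the training flow, not the authors' implementation.

```python
class Stub:
    """Placeholder model exposing the calls the training loop assumes."""
    def fit_step(self, *args):
        pass  # a real model would run a gradient step here

    def shift_embedding(self, hist):
        # Pretend the teacher summarizes the historical window into a
        # distribution-shift target for the student (illustrative only).
        return [sum(hist) / len(hist)]

def train_dishft(teacher, student, batches, epochs=2):
    # Hypothetical two-stage iteration: the teacher first learns the mapping
    # from historical to future patterns, then the student is supervised by
    # both the trend label and the teacher's intermediate signal.
    for _ in range(epochs):
        for hist, future, label in batches:
            teacher.fit_step(hist, future)          # stage 1: teacher update
            target = teacher.shift_embedding(hist)  # intermediate supervision
            student.fit_step(hist, label, target)   # stage 2: student update
    return student
```

At inference time only the student is needed; the teacher's future inputs are used exclusively during training, which is what keeps the deployed predictor free of look-ahead leakage.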
Problem

Research questions and friction points this paper is trying to address.

Stock trends must be forecast from noisy historical data and market indicators
Existing GNN predictors model only historical spatiotemporal dependencies
Correlations between historical and future patterns go unexploited
Innovation

Methods, ideas, or system contributions that make the work stand out.

Distillation-based GNN framework
Future-aware spatiotemporal embeddings
Iterative teacher–student training with intermediate supervision