🤖 AI Summary
Conventional graph neural networks (GNNs) for stock trend forecasting model only historical spatiotemporal dependencies, neglecting the distributional relationship between past and future patterns. To address this, we propose DishFT-GNN, a future-aware distillation framework that, for the first time, introduces future distribution awareness into GNN training. Through teacher–student collaborative knowledge distillation, the teacher model identifies distribution shifts between historical and future patterns, and these shifts serve as an intermediate supervisory signal that guides the student to learn future-aware spatiotemporal embeddings. The framework combines multi-step temporal graph construction with a two-stage iterative training paradigm to implicitly capture dynamic trend evolution. Evaluated on two real-world financial datasets, DishFT-GNN significantly improves the accuracy and robustness of upward/downward trend prediction, achieving state-of-the-art performance.
📝 Abstract
Stock trend prediction involves forecasting future price movements by analyzing historical data and various market indicators. With the advancement of machine learning, graph neural networks (GNNs) have been widely employed in stock prediction due to their powerful capability to capture spatiotemporal dependencies among stocks. However, despite the efforts of various GNN stock predictors to enhance predictive performance, the improvements remain limited, because they analyze only historical spatiotemporal dependencies and overlook the correlation between historical and future patterns. In this study, we propose a novel distillation-based future-aware GNN framework (DishFT-GNN) for stock trend prediction. Specifically, DishFT-GNN iteratively trains a teacher model and a student model. The teacher model learns to capture the correlation between distribution shifts of historical and future data, which is then used as intermediate supervision to guide the student model toward future-aware spatiotemporal embeddings for accurate prediction. Through extensive experiments on two real-world datasets, we verify the state-of-the-art performance of DishFT-GNN.
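To make the distillation idea concrete, here is a minimal sketch of how the teacher's distribution-shift signal could enter the student's loss. All names are hypothetical and the models are stand-in functions, not the paper's actual GNN architectures; the shift statistic and loss weighting are illustrative assumptions, not the authors' definitions.

```python
# Hypothetical sketch of DishFT-GNN-style future-aware supervision.
# Plain functions stand in for the teacher and student GNNs.

def distribution_shift(hist, future):
    # Toy proxy for the shift between historical and future return
    # distributions: the difference of their means. The paper's
    # actual shift statistic is not specified here.
    return sum(future) / len(future) - sum(hist) / len(hist)

def student_loss(pred, label, student_signal, teacher_signal, alpha=0.5):
    # Task loss (squared error on the trend label) plus a distillation
    # term pulling the student's intermediate signal toward the
    # teacher-estimated distribution shift; alpha is an assumed weight.
    task = (pred - label) ** 2
    distill = (student_signal - teacher_signal) ** 2
    return task + alpha * distill

# Stage 1 (teacher): during training, the teacher sees both windows
# and learns the shift; here we just compute the toy statistic.
shift = distribution_shift(hist=[1.0, 2.0, 3.0], future=[2.0, 3.0, 4.0])

# Stage 2 (student): the student sees only history, and its loss
# combines trend prediction with matching the teacher's signal.
loss = student_loss(pred=0.0, label=1.0, student_signal=0.0,
                    teacher_signal=shift, alpha=0.5)
```

In the full framework the two stages alternate iteratively, so the teacher's shift estimates and the student's embeddings refine each other across training rounds.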