🤖 AI Summary
Financial time series forecasting requires jointly modeling short-term fluctuations and long-term dependencies; however, existing Transformer-based approaches are predominantly limited to univariate, single-task settings and fail to capture cross-stock industrial synergies. To address this, we propose Market-Transformer, a novel framework for joint multi-stock price forecasting. First, we design an industry-aware multidimensional feature selection mechanism that explicitly incorporates inter-stock price correlations. Second, we introduce a tightly integrated architecture combining Time2Vec with the Transformer encoder—replacing conventional positional encoding—to enhance temporal pattern representation. Third, we employ a hyperparameter co-optimization strategy to improve generalization. Extensive experiments on multi-stock price prediction demonstrate that Market-Transformer significantly outperforms state-of-the-art methods, validating the effectiveness and necessity of correlation-driven multivariate modeling in financial time series forecasting.
📝 Abstract
Financial prediction is a complex and challenging time series analysis and signal processing task, requiring models to capture both short-term fluctuations and long-term temporal dependencies. Transformers have achieved remarkable success, most notably in natural language processing, through the attention mechanism, which has also influenced the time series community. Their ability to capture both short- and long-range dependencies helps in understanding the financial market and recognizing price patterns, leading to successful applications of Transformers in stock prediction. However, previous research predominantly focuses on individual features and single-stock predictions, which limits a model's ability to understand broader market trends. In reality, within sectors such as finance and technology, companies in the same industry often exhibit correlated stock price movements. In this paper, we develop a novel neural network architecture by integrating Time2Vec with the encoder of the Transformer model. Based on a study of different markets, we propose a novel correlation-based feature selection method. Through comprehensive fine-tuning of multiple hyperparameters, we conduct a comparative analysis of our results against benchmark models. We conclude that our method outperforms state-of-the-art encoding methods such as positional encoding, and that selecting correlated features enhances the accuracy of predicting multiple stock prices.
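To make the Time2Vec component concrete, here is a minimal NumPy sketch of the standard Time2Vec embedding (one linear component for trend, sinusoidal components for periodicity) that would replace the Transformer's positional encoding. The parameter shapes and the fixed random weights are illustrative assumptions; in the actual model `omega` and `phi` would be learned jointly with the encoder.

```python
import numpy as np

def time2vec(tau, omega, phi):
    """Time2Vec embedding of scalar time steps `tau`.

    Component 0 is linear (omega[0] * tau + phi[0]), capturing non-periodic
    trends; components 1..k are sinusoidal, capturing periodic patterns.
    `omega` and `phi` have shape (k + 1,) and are learnable in practice;
    here they are fixed random values for illustration.
    """
    tau = np.asarray(tau, dtype=float).reshape(-1, 1)  # (T, 1)
    z = tau * omega + phi                              # broadcast to (T, k + 1)
    z[:, 1:] = np.sin(z[:, 1:])                        # periodic components only
    return z

rng = np.random.default_rng(0)
k = 7                                   # number of sinusoidal components (assumed)
omega = rng.normal(size=k + 1)
phi = rng.normal(size=k + 1)
emb = time2vec(np.arange(30), omega, phi)  # 30 time steps -> (30, 8) embedding
```

In the full architecture, each time step's Time2Vec vector would be concatenated with (or added to) the price-feature embedding before it enters the Transformer encoder, taking the place of the usual sinusoidal positional encoding.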
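The correlation-based feature selection can likewise be sketched in a few lines: compute pairwise price correlations and keep, for a target stock, the peers whose correlation magnitude exceeds a threshold. The 0.6 threshold, ticker names, and toy data below are purely illustrative assumptions, not the paper's exact criterion.

```python
import numpy as np
import pandas as pd

def select_correlated(prices: pd.DataFrame, target: str, threshold: float = 0.6):
    """Return tickers whose series correlate with `target` above `threshold`.

    `prices` has one column per ticker. The Pearson correlation and the
    threshold value are illustrative choices for this sketch.
    """
    corr = prices.corr()[target].drop(target)      # correlations with the target
    return corr[corr.abs() >= threshold].index.tolist()

# toy example: two co-moving series plus one unrelated noise series
t = np.linspace(0, 4 * np.pi, 200)
rng = np.random.default_rng(1)
prices = pd.DataFrame({
    "AAA": np.sin(t) + 0.05 * rng.normal(size=t.size),
    "BBB": np.sin(t) + 0.05 * rng.normal(size=t.size),  # tracks AAA closely
    "CCC": rng.normal(size=t.size),                      # uncorrelated noise
})
peers = select_correlated(prices, "AAA")  # selects "BBB", drops "CCC"
```

The selected peer columns would then be fed alongside the target stock's own features as the multivariate input to the model, which is what allows joint multi-stock prediction.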