MaGNet: A Mamba Dual-Hypergraph Network for Stock Prediction via Temporal-Causal and Global Relational Learning

📅 2025-10-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Stock trend forecasting faces challenges including high market volatility, complex temporal dynamics, and difficulty in modeling cross-market interdependencies. To address these, we propose MaGNet, a novel framework featuring: (1) a MAGE block, a bidirectional Mamba architecture integrated with a sparse Mixture-of-Experts (MoE) module to enhance long-range temporal modeling; (2) a dual-dimensional spatiotemporal attention mechanism operating over both feature and stock dimensions, effectively decoupling local temporal dependencies from global market structure; and (3) a joint learning paradigm combining temporal-causal hypergraphs and global probabilistic hypergraphs, with multi-source relational embeddings fused via Jensen–Shannon divergence–weighted aggregation. Evaluated on six major stock indices, MaGNet achieves significant improvements over state-of-the-art methods in prediction accuracy. Empirical backtesting demonstrates superior annualized returns and reduced maximum drawdown, confirming its robustness and risk-control capability.

📝 Abstract
Stock trend prediction is crucial for profitable trading strategies and portfolio management, yet it remains challenging due to market volatility, complex temporal dynamics, and multifaceted inter-stock relationships. Existing methods struggle to capture temporal dependencies and dynamic inter-stock interactions effectively, often neglecting cross-sectional market influences, relying on static correlations, treating nodes and edges uniformly, and conflating diverse relationships. This work introduces MaGNet, a novel Mamba dual-hyperGraph Network for stock prediction, integrating three key innovations: (1) a MAGE block that leverages bidirectional Mamba with adaptive gating mechanisms for contextual temporal modeling and integrates a sparse Mixture-of-Experts layer for dynamic adaptation to diverse market conditions, alongside multi-head attention for capturing global dependencies; (2) Feature-wise and Stock-wise 2D Spatiotemporal Attention modules that enable precise fusion of multivariate features and cross-stock dependencies, enhancing informativeness while preserving intrinsic data structures and bridging temporal modeling with relational reasoning; and (3) a dual hypergraph framework consisting of a Temporal-Causal Hypergraph (TCH), which captures fine-grained causal dependencies under temporal constraints, and a Global Probabilistic Hypergraph (GPH), which models market-wide patterns through soft hyperedge assignments and a Jensen–Shannon Divergence weighting mechanism, jointly disentangling localized temporal influences from instantaneous global structures for multi-scale relational learning. Extensive experiments on six major stock indices demonstrate that MaGNet outperforms state-of-the-art methods, delivering superior predictive performance and exceptional investment returns with robust risk management. Code available at: https://github.com/PeilinTime/MaGNet.
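The abstract does not give the exact fusion formula, but the Jensen–Shannon divergence–weighted aggregation of multi-source relational embeddings can be sketched roughly as follows. This is an illustrative assumption, not the paper's implementation: embeddings are softmax-normalized into distributions, each source is scored by its JS divergence from a reference distribution, and lower-divergence sources receive higher fusion weights. All function names and the `exp(-divergence)` weighting scheme are hypothetical.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def softmax(x):
    """Numerically stable softmax, mapping an embedding to a distribution."""
    e = np.exp(x - x.max())
    return e / e.sum()

def jsd_weighted_fusion(embeddings, reference):
    """Fuse multi-source relational embeddings: sources whose softmax
    distribution diverges less from the reference get more weight.
    (Illustrative scheme; the paper's exact weighting may differ.)"""
    ref = softmax(reference)
    divs = np.array([js_divergence(softmax(e), ref) for e in embeddings])
    weights = np.exp(-divs)      # lower divergence -> larger weight
    weights /= weights.sum()     # normalize weights to sum to 1
    return sum(w * e for w, e in zip(weights, embeddings))
```

With identical sources, the weights collapse to uniform and the fusion returns the shared embedding; divergent sources are down-weighted smoothly rather than discarded.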
Problem

Research questions and friction points this paper is trying to address.

Capturing complex temporal dependencies and dynamic inter-stock interactions
Modeling cross-sectional market influences and disentangling diverse relationships
Addressing market volatility through multi-scale temporal-causal and global learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bidirectional Mamba with adaptive gating for temporal modeling
Dual hypergraph framework disentangling temporal and global structures
2D spatiotemporal attention modules for feature and stock fusion
Peilin Tan — University of California, San Diego, La Jolla, CA, USA
Chuanqi Shi — University of California, San Diego, La Jolla, CA, USA
Dian Tu — University of California, San Diego, La Jolla, CA, USA
Liang Xie — Wuhan University of Technology
Time Series Forecasting · Cross-modal Learning