TacticExpert: Spatial-Temporal Graph Language Model for Basketball Tactics

📅 2025-03-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Basketball tactical modeling faces three core challenges: capturing complex spatial-temporal dependencies from historical player trajectories, modeling fine-grained long-range interactions among heterogeneous players, and generalizing to untrained downstream tasks and zero-shot scenarios. To address these, the paper proposes the Spatial-Temporal Propagation Symmetry-Aware Graph Transformer, presented as the first to incorporate symmetry priors directly into the attention mechanism. It further designs a contrastively trained Mixture of Tactics Experts module and a lightweight graph-grounding architecture for large language models, enabling synergistic integration of GNNs, Transformers, contrastive learning, and LLMs. The framework adopts a dense-training, sparse-inference paradigm. Under unified pretraining across multiple datasets, it reports significant gains in tactical event prediction accuracy, enhanced zero-shot transferability, a 2.4x improvement in inference efficiency, and strong interpretability.

📝 Abstract
The core challenge in basketball tactic modeling lies in efficiently extracting complex spatial-temporal dependencies from historical data and accurately predicting various in-game events. Existing state-of-the-art (SOTA) models, primarily based on graph neural networks (GNNs), encounter difficulties in capturing long-term, long-distance, and fine-grained interactions among heterogeneous player nodes, as well as in recognizing interaction patterns. Additionally, they exhibit limited generalization to untrained downstream tasks and zero-shot scenarios. In this work, we propose a Spatial-Temporal Propagation Symmetry-Aware Graph Transformer for fine-grained game modeling. This architecture explicitly captures delay effects in the spatial space to enhance player node representations across discrete-time slices, employing symmetry-invariant priors to guide the attention mechanism. We also introduce an efficient contrastive learning strategy to train a Mixture of Tactics Experts module, facilitating differentiated modeling of offensive tactics. By integrating dense training with sparse inference, we achieve a 2.4x improvement in model efficiency. Moreover, the incorporation of Lightweight Graph Grounding for Large Language Models enables robust performance in open-ended downstream tasks and zero-shot scenarios, including novel teams or players. The proposed model, TacticExpert, delineates a vertically integrated large model framework for basketball, unifying pretraining across multiple datasets and downstream prediction tasks. Fine-grained modeling modules significantly enhance spatial-temporal representations, and visualization analyses confirm the strong interpretability of the model.
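To make the abstract's "symmetry-invariant priors to guide the attention mechanism" concrete, one illustrative construction biases attention scores between player nodes with their pairwise on-court distance, a quantity invariant under rotations, reflections, and translations of the court. This is a minimal sketch under that assumption, not the paper's actual prior; the `distance_bias_attention` function and the `w_bias` weight are hypothetical names:

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def distance_bias_attention(feats, positions, w_bias=-0.5):
    """One attention head over player nodes where scores get an additive
    bias from pairwise distance (hypothetical stand-in for the paper's
    symmetry-invariant prior): closer players attend to each other more."""
    d = len(feats[0])
    out = []
    for i, q in enumerate(feats):
        scores = []
        for j, k in enumerate(feats):
            dot = sum(a * b for a, b in zip(q, k)) / math.sqrt(d)
            dist = math.dist(positions[i], positions[j])  # rotation/reflection-invariant
            scores.append(dot + w_bias * dist)
        attn = softmax(scores)
        # weighted sum of value vectors (values = feats here, for brevity)
        out.append([sum(a * v[c] for a, v in zip(attn, feats))
                    for c in range(d)])
    return out
```

Because the bias depends only on distances, rotating or reflecting every player position leaves the output unchanged, which is exactly the invariance such a prior is meant to encode.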
Problem

Research questions and friction points this paper is trying to address.

Extracting spatial-temporal dependencies in basketball tactics.
Improving generalization in untrained and zero-shot scenarios.
Enhancing efficiency and interpretability of game modeling.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spatial-Temporal Propagation Symmetry-Aware Graph Transformer
Contrastive learning for Mixture of Tactics Experts
Lightweight Graph Grounding for Large Language Models
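The dense-training, sparse-inference paradigm behind the reported 2.4x efficiency gain can be sketched as standard mixture-of-experts gating: during training every expert runs and outputs are gate-weighted (so all experts receive gradients), while at inference only the top-k gated experts execute. The `moe_forward` helper and scalar experts below are illustrative assumptions, not TacticExpert's actual Mixture of Tactics Experts module:

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def moe_forward(x, experts, gate_logits, top_k=1, dense=True):
    """Mixture-of-experts layer (hypothetical sketch). Dense mode runs
    every expert and gate-weights the outputs; sparse mode runs only the
    top-k gated experts, renormalizing their gates -- the source of the
    inference-time speedup."""
    gates = softmax(gate_logits)
    if dense:
        outs = [e(x) for e in experts]          # all experts execute
        return sum(g * o for g, o in zip(gates, outs))
    top = sorted(range(len(gates)), key=lambda i: gates[i], reverse=True)[:top_k]
    norm = sum(gates[i] for i in top)
    return sum(gates[i] / norm * experts[i](x) for i in top)
```

With k much smaller than the number of experts, sparse inference skips most expert computation while staying close to the dense output whenever the gate is confident.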