JiuTian Chuanliu: A Large Spatiotemporal Model for General-purpose Dynamic Urban Sensing

πŸ“… 2025-10-26
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Existing approaches typically target isolated tasks, limiting their capacity to comprehensively model human mobility and resulting in poor generalizability. To address this, the paper proposes GDHME (General-purpose and Dynamic Human Mobility Embedding), a framework that combines continuous-time dynamic graph encoding with autoregressive self-supervised learning to jointly characterize interactions among individuals, geographic regions, and time. Leveraging real-world cellular (base-station) trajectory data, GDHME uses a continuous-time graph neural network to learn evolving node representations that transfer across tasks and uncover latent semantic patterns in mobility behavior. Offline experiments on a multi-task urban sensing benchmark demonstrate its ability to automatically learn discriminative node representations. GDHME underpins the deployed JiuTian ChuanLiu Big Model, presented at the 2023 China Mobile Worldwide Partner Conference, validating its effectiveness across multi-city urban sensing tasks.

πŸ“ Abstract
As a window for urban sensing, human mobility contains rich spatiotemporal information that reflects both residents' behavior preferences and the functions of urban areas. The analysis of human mobility has attracted the attention of many researchers. However, existing methods often address specific tasks from a particular perspective, leading to insufficient modeling of human mobility and limited applicability of the learned knowledge to various downstream applications. To address these challenges, this paper proposes to push massive amounts of human mobility data into a spatiotemporal model, discover the latent semantics behind mobility behavior, and support various urban sensing tasks. Specifically, a large-scale, wide-coverage human mobility dataset is collected through the ubiquitous base station system, and a framework named General-purpose and Dynamic Human Mobility Embedding (GDHME) for urban sensing is introduced. The framework follows the self-supervised learning idea and contains two major stages. In stage 1, GDHME treats people and regions as nodes within a dynamic graph, unifying human mobility data as people-region-time interactions. An encoder operating in continuous time dynamically computes evolving node representations, capturing the dynamic states of both people and regions. Moreover, an autoregressive self-supervised task is specially designed to guide the learning of general-purpose node embeddings. In stage 2, these representations are utilized to support various tasks. To evaluate the effectiveness of the GDHME framework, we further construct a multi-task urban sensing benchmark. Offline experiments demonstrate GDHME's ability to automatically learn valuable node features from vast amounts of data. Furthermore, our framework is used to deploy the JiuTian ChuanLiu Big Model, a system that was presented at the 2023 China Mobile Worldwide Partner Conference.
Problem

Research questions and friction points this paper is trying to address.

Modeling human mobility from spatiotemporal data for urban sensing
Learning general-purpose embeddings to support diverse downstream tasks
Capturing dynamic states of people and regions through continuous-time encoding
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic graph modeling of people-region-time interactions
Continuous-time encoder for evolving node representations
Autoregressive self-supervised learning for general embeddings
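The three ideas above can be sketched together in a toy form: treat interaction events (person, region, timestamp) as a time-ordered stream, decay each person's state by the elapsed time since their last event, fold in the visited region's embedding, and use the person's current state to score candidate next regions as an autoregressive proxy task. This is a minimal illustration under assumed mechanics, not the paper's implementation; the class, parameters, and one-hot initialization (used only to keep the toy run deterministic) are all invented.

```python
import numpy as np

# Hypothetical sketch (names invented): continuous-time dynamic graph
# embeddings over people-region-time interaction events, in the spirit
# of a GDHME-style stage-1 encoder. Not the paper's actual model.
class DynamicMobilityEmbedding:
    def __init__(self, n_people, n_regions, dim=4, decay=0.1, lr=0.2):
        # One-hot region init and zero person states keep this toy run
        # deterministic; a real model would learn embeddings from data.
        self.people = np.zeros((n_people, dim))
        self.regions = np.eye(n_regions, dim)
        self.last_seen = np.zeros(n_people)
        self.decay = decay
        self.lr = lr

    def update(self, person, region, t):
        # Continuous-time update: decay the person's state by the time
        # elapsed since their last event, then mix in the visited
        # region's embedding; the region absorbs the visitor's state too.
        dt = t - self.last_seen[person]
        self.people[person] *= np.exp(-self.decay * dt)
        self.people[person] += self.lr * self.regions[region]
        self.regions[region] += self.lr * self.people[person]
        self.last_seen[person] = t

    def next_region_scores(self, person):
        # Autoregressive proxy task: score candidate next regions by
        # similarity to the person's current dynamic state.
        return self.regions @ self.people[person]

# Person 0 visits region 1 at four timestamps; region 1 ends up highest.
model = DynamicMobilityEmbedding(n_people=2, n_regions=3)
for t in [1.0, 2.0, 3.5, 5.0]:
    model.update(person=0, region=1, t=t)
scores = model.next_region_scores(0)
print(int(np.argmax(scores)))  # → 1
```

The exponential time decay is what makes the encoding continuous-time rather than snapshot-based: a person's state fades with inactivity, so two visits an hour apart influence the representation differently than two visits a week apart.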
πŸ”Ž Similar Papers
No similar papers found.
Authors

Liangzhe Han (Beihang University): Data Mining, Graph Neural Network, Spatio-temporal Learning
Leilei Sun (Beihang University): Data Mining, Machine Learning, Graph Learning
Tongyu Zhu (CCSE, Beihang University, China)
Tao Tao (University of Maryland)
Jibin Wang (China Mobile Group IT Center, China)
Weifeng Lv (CCSE, Beihang University, China)