🤖 AI Summary
Existing CNN-based diffusion models (e.g., UNet) for GPS trajectory generation suffer from limited local receptive fields and insufficient modeling capacity, leading to spatial misalignment and loss of street-level geometric detail. To address this, we propose a Transformer-based diffusion framework designed specifically for trajectory generation. Our method introduces a spatiotemporal-aware positional encoding scheme, systematically compares longitude-latitude embedding against location embedding for coordinate representation, and supports trajectory synthesis at multiple scales. Extensive experiments on two real-world trajectory datasets show that our approach substantially reduces trajectory displacement error and improves fine-grained spatial fidelity, outperforming state-of-the-art methods on quantitative metrics, including Fréchet Inception Distance (FID) and Dynamic Time Warping (DTW), as well as in human evaluation. This work establishes a new paradigm for low-cost, privacy-preserving trajectory synthesis.
📝 Abstract
The widespread use of GPS devices has driven advances in spatiotemporal data mining, enabling machine learning models to simulate human decision-making and generate realistic trajectories, which reduces both data-collection costs and privacy concerns. Recent studies have shown the promise of diffusion models for high-quality trajectory generation. However, most existing methods rely on convolution-based architectures (e.g., UNet) to predict noise during the diffusion process, which often results in notable deviations and the loss of fine-grained street-level details due to limited model capacity. In this paper, we propose Trajectory Transformer, a novel model that employs a transformer backbone for both conditional information embedding and noise prediction. We explore two GPS coordinate embedding strategies, location embedding and longitude-latitude embedding, and analyze model performance at different scales. Experiments on two real-world datasets demonstrate that Trajectory Transformer significantly enhances generation quality and effectively alleviates the deviation issues observed in prior approaches.
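The abstract does not include code, so the sketch below is only a rough illustration of what the two coordinate-embedding strategies might look like: a continuous longitude-latitude embedding (each coordinate mapped through a standard sinusoidal transformer encoding and concatenated) versus a discrete location embedding (the map gridded into cells, each cell looked up in a learned table). All function names, the grid resolution, and the embedding dimension are hypothetical, not taken from the paper.

```python
import numpy as np

def sinusoidal_embedding(x, dim):
    """Map a scalar to a `dim`-dimensional sinusoidal encoding,
    in the style of the original transformer positional encoding."""
    half = dim // 2
    freqs = np.exp(-np.log(10000.0) * np.arange(half) / half)
    angles = np.asarray(x, dtype=float)[..., None] * freqs
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

def lonlat_embedding(lon, lat, dim):
    """Hypothetical longitude-latitude embedding: encode each
    coordinate separately, then concatenate (2 * dim features)."""
    return np.concatenate(
        [sinusoidal_embedding(lon, dim), sinusoidal_embedding(lat, dim)],
        axis=-1,
    )

def location_embedding(lon, lat, grid, table):
    """Hypothetical location embedding: discretize the map into a
    grid x grid lattice and look the cell index up in a learned table."""
    col = min(int((lon + 180.0) / 360.0 * grid), grid - 1)
    row = min(int((lat + 90.0) / 180.0 * grid), grid - 1)
    return table[row * grid + col]

# Toy usage: embed one GPS point with both strategies.
rng = np.random.default_rng(0)
table = rng.normal(size=(16 * 16, 32))    # 16x16 grid, 32-dim learned table
e1 = lonlat_embedding(116.40, 39.90, 16)  # continuous: 2 * 16 = 32 features
e2 = location_embedding(116.40, 39.90, 16, table)
print(e1.shape, e2.shape)                 # both 32-dimensional here
```

The trade-off the abstract hints at is visible even in this toy form: the continuous variant preserves fine spatial differences between nearby points, while the table lookup collapses every point in a cell to one vector but can store learned, city-specific information per cell.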