🤖 AI Summary
Existing self-supervised time series representation learning methods—such as masked modeling—are vulnerable to input noise and confounding variables. To address this, we propose Time Series Joint Embedding Predictive Architecture (TS-JEPA), the first framework to adapt the JEPA paradigm to time series. TS-JEPA jointly optimizes future segment prediction and contrastive representation learning in a latent space, eliminating reliance on raw-input reconstruction and thereby significantly enhancing robustness. Its unified architecture natively supports both classification and forecasting tasks. Evaluated across multiple standard benchmarks, TS-JEPA achieves state-of-the-art or competitive performance, demonstrating strong generalization and balanced multi-task capability. This work establishes a novel paradigm for developing robust, general-purpose time series foundation models.
📝 Abstract
Self-supervised learning has recently achieved great success in representation learning, enabling breakthroughs in natural language and image processing. However, these methods often rely on autoregressive and masked modeling, which aim to reproduce masked portions of the input and are therefore vulnerable to noise and confounding variables. To address this problem, Joint-Embedding Predictive Architectures (JEPA) have been introduced to perform self-supervised learning in the latent space. To bring these advances to the time series domain, we introduce Time Series JEPA (TS-JEPA), an architecture specifically adapted for time series representation learning. We validate TS-JEPA on both classification and forecasting, showing that it matches or surpasses current state-of-the-art baselines on standard datasets. Notably, our approach demonstrates a strong performance balance across diverse tasks, indicating its potential as a robust foundation for learning general representations. This work thus lays the groundwork for developing future time series foundation models based on joint embedding.
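The core JEPA idea described above — predicting the representation of a future segment in latent space rather than reconstructing raw inputs — can be sketched minimally as follows. This is an illustrative toy, not the paper's implementation: the encoder, predictor, shapes, and EMA target update are all assumptions standing in for the real networks.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    """Toy encoder: a linear map plus tanh (stand-in for a real network)."""
    return np.tanh(x @ W)

# Hypothetical shapes: a series split into context windows and future target windows.
dim_in, dim_lat = 16, 8
context = rng.normal(size=(4, dim_in))  # batch of context segments
target = rng.normal(size=(4, dim_in))   # corresponding future segments

W_ctx = rng.normal(scale=0.1, size=(dim_in, dim_lat))   # context encoder weights
W_tgt = W_ctx.copy()                                    # target encoder (EMA copy)
W_pred = rng.normal(scale=0.1, size=(dim_lat, dim_lat)) # predictor weights

# JEPA-style objective: predict the target's *latent* representation.
z_ctx = encode(context, W_ctx)
z_pred = z_ctx @ W_pred        # predicted latent of the future segment
z_tgt = encode(target, W_tgt)  # target latent (no gradient flows here in practice)

# Latent-space prediction error -- no raw-input reconstruction involved.
loss = np.mean((z_pred - z_tgt) ** 2)

# The target encoder tracks the context encoder via an exponential moving
# average (a common JEPA/BYOL-style choice to avoid collapse; assumed here).
momentum = 0.99
W_tgt = momentum * W_tgt + (1 - momentum) * W_ctx

print(float(loss))
```

Because the loss compares embeddings rather than raw samples, high-frequency noise in the input need not be modeled, which is the robustness argument the abstract makes.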