🤖 AI Summary
Unsupervised multivariate time series representation learning often yields representations with weak temporal consistency and limited semantic content. To address this, we propose a dual-masked autoencoder framework: masks are applied jointly at the input layer and in the latent space, enabling co-optimization of masked value reconstruction and latent representation prediction. A momentum-updated teacher encoder, coupled with feature-level alignment constraints, jointly optimizes observable attributes and latent structural patterns. This design significantly enhances the temporal robustness and semantic discriminability of the learned representations. Extensive experiments demonstrate that our method consistently outperforms state-of-the-art unsupervised baselines across diverse downstream tasks—including classification, regression, and forecasting—on multiple benchmark datasets. Moreover, the learned representations exhibit strong transferability and generalization capability.
📝 Abstract
Unsupervised multivariate time series (MTS) representation learning aims to extract compact and informative representations from raw sequences without relying on labels, enabling efficient transfer to diverse downstream tasks. In this paper, we propose the Dual-Masked Autoencoder (DMAE), a novel masked time-series modeling framework for unsupervised MTS representation learning. DMAE formulates two complementary pretext tasks: (1) reconstructing masked values from visible attributes, and (2) estimating latent representations of masked features under the guidance of a momentum-updated teacher encoder. To further improve representation quality, we introduce a feature-level alignment constraint that encourages the predicted latent representations to align with the teacher's outputs. By jointly optimizing these objectives, DMAE learns temporally coherent and semantically rich representations. Comprehensive evaluations across classification, regression, and forecasting tasks demonstrate that our approach achieves consistent and superior performance over competitive baselines.
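The interplay of the two pretext tasks and the momentum teacher can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the linear "encoders", the decoder, the shapes, the masking ratio, and the momentum coefficient `m` are all hypothetical assumptions introduced for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def ema_update(teacher, student, m=0.99):
    """Momentum (EMA) update: teacher <- m * teacher + (1 - m) * student."""
    return {k: m * teacher[k] + (1 - m) * student[k] for k in teacher}

def encode(params, x):
    # Toy linear "encoder"; a real model would be a deep temporal network.
    return x @ params["W"]

# Hypothetical shapes: 16 timesteps, 8 channels, 4-dim latents.
student = {"W": rng.normal(size=(8, 4))}
teacher = {k: v.copy() for k, v in student.items()}
decoder = {"W": rng.normal(size=(4, 8))}  # hypothetical linear decoder

x = rng.normal(size=(16, 8))
mask = rng.random(16) < 0.5          # mask roughly half the timesteps
x_masked = x.copy()
x_masked[mask] = 0.0                 # input-level masking

z_student = encode(student, x_masked)
z_teacher = encode(teacher, x)       # teacher sees the full sequence

# (1) Masked-value reconstruction: recover the masked inputs.
x_hat = z_student @ decoder["W"]
loss_rec = np.mean((x_hat[mask] - x[mask]) ** 2)

# (2) Feature-level alignment: predicted latents at masked positions
#     should match the (frozen) teacher's latents.
loss_align = np.mean((z_student[mask] - z_teacher[mask]) ** 2)

loss = loss_rec + loss_align

# After a (notional) gradient step on the student, the teacher
# follows via the momentum update rather than backpropagation.
teacher = ema_update(teacher, student, m=0.99)
```

In practice the two losses would be backpropagated through the student only, with the teacher trailing the student as an exponential moving average; the sketch above just makes the data flow of the dual objective explicit.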