🤖 AI Summary
To address the challenge of simultaneously ensuring fidelity, controllability, and diversity in time-series data augmentation, this paper proposes the Latent Generative Transformer Augmentation (L-GTA) model, a transformer-based variational recurrent autoencoder. The model applies interpretable transformations, such as jittering and magnitude warping, within its latent space, enabling fine-grained, compositional generation. It preserves both the statistical properties and the dynamic structure of the original sequences while supporting flexible control, from simple perturbations to complex pattern reconstruction. Extensive experiments on multiple real-world datasets show that the augmented data significantly improves forecasting accuracy (+3.2% on average), classification F1-score (+2.8%), and anomaly detection AUC (+4.1%). Moreover, the generated samples outperform conventional hand-crafted transformations and state-of-the-art generative models under dynamic time warping (DTW) and mean squared error (MSE) metrics, validating the method's effectiveness and generalizability.
📝 Abstract
Data augmentation is gaining importance across various aspects of time series analysis, from forecasting to classification and anomaly detection. We introduce the Latent Generative Transformer Augmentation (L-GTA) model, a generative approach built on a transformer-based variational recurrent autoencoder. The model applies controlled transformations within its latent space to generate new time series that preserve the intrinsic properties of the original dataset. L-GTA supports diverse transformations, ranging from simple jittering to magnitude warping, and can combine these basic transformations to produce more complex synthetic time series. Our evaluation on several real-world datasets demonstrates that L-GTA produces more reliable, consistent, and controllable augmented data, which translates into significant improvements in predictive accuracy and similarity measures compared to direct transformation methods.
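For context, the basic transformations the abstract mentions, jittering and magnitude warping, are simple to state in code. The sketch below is an illustration of these standard direct (data-space) transformations that L-GTA compares against, not of L-GTA's latent-space mechanism; the function names and parameter defaults (`sigma`, `n_knots`) are my own choices, not from the paper.

```python
import numpy as np

def jitter(x, sigma=0.03, rng=None):
    """Jittering: add i.i.d. Gaussian noise to each time step."""
    rng = np.random.default_rng(rng)
    return x + rng.normal(0.0, sigma, size=x.shape)

def magnitude_warp(x, sigma=0.2, n_knots=4, rng=None):
    """Magnitude warping: scale the series by a smooth random curve
    obtained by interpolating random factors at a few knots."""
    rng = np.random.default_rng(rng)
    t = np.arange(len(x))
    knots = np.linspace(0, len(x) - 1, n_knots)
    factors = rng.normal(1.0, sigma, size=n_knots)
    warp = np.interp(t, knots, factors)  # piecewise-linear smooth curve
    return x * warp

# Basic transformations can be composed into more complex augmentations.
x = np.sin(np.linspace(0, 4 * np.pi, 200))
x_aug = magnitude_warp(jitter(x, rng=0), rng=1)
```

Composing such transformations directly on the raw series is exactly what L-GTA replaces with controlled operations in the model's latent space, which is how it preserves the intrinsic properties of the original data.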