Unconditional flow-based time series generation with equivariance-regularised latent spaces

📅 2026-01-30
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work proposes an equivariant latent space modeling approach to enhance both the quality and efficiency of time series generation. Addressing the challenge that existing generative models often fail to preserve structural consistency under transformations such as translation and amplitude scaling, the method introduces equivariant regularization within the low-dimensional latent space of a pretrained autoencoder. A tailored equivariant loss function is designed to explicitly incorporate geometric inductive biases into a flow-matching-based generative framework. Evaluated on multiple real-world time series datasets, the proposed approach substantially outperforms diffusion-based baselines in standard generation metrics while achieving sampling speeds several orders of magnitude faster, effectively balancing high-fidelity synthesis with computational efficiency.

๐Ÿ“ Abstract
Flow-based models have proven successful for time-series generation, particularly when defined in lower-dimensional latent spaces that enable efficient sampling. However, how to design latent representations with desirable equivariance properties for time-series generative modelling remains underexplored. In this work, we propose a latent flow-matching framework in which equivariance is explicitly encouraged through a simple regularisation of a pre-trained autoencoder. Specifically, we introduce an equivariance loss that enforces consistency between transformed signals and their reconstructions, and use it to fine-tune latent spaces with respect to basic time-series transformations such as translation and amplitude scaling. We show that these equivariance-regularised latent spaces improve generation quality while preserving the computational advantages of latent flow models. Experiments on multiple real-world datasets demonstrate that our approach consistently outperforms existing diffusion-based baselines in standard time-series generation metrics, while achieving orders-of-magnitude faster sampling. These results highlight the practical benefits of incorporating geometric inductive biases into latent generative models for time series.
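The equivariance loss described in the abstract can be sketched in code. The following is a minimal NumPy illustration, not the authors' implementation: it assumes hypothetical `encode`/`decode` functions standing in for the pre-trained autoencoder, and uses the two transformations named in the abstract (time translation via a circular shift, and amplitude scaling). The loss penalises the gap between autoencoding a transformed series and transforming the autoencoded series.

```python
import numpy as np

def equivariance_loss(encode, decode, x, shift, scale):
    """Hedged sketch of an equivariance regulariser for a time-series
    autoencoder: compare AE(T(x)) against T(AE(x)), where T applies an
    amplitude scaling followed by a circular time shift.

    encode/decode are placeholder callables; the paper's actual loss
    may differ in form.
    """
    # T: amplitude scaling + time translation along the last axis
    t = lambda s: np.roll(scale * s, shift, axis=-1)

    recon_of_tx = decode(encode(t(x)))  # autoencode the transformed signal
    t_of_recon = t(decode(encode(x)))   # transform the autoencoded signal

    # Mean squared discrepancy; zero iff the autoencoder commutes with T
    return np.mean((recon_of_tx - t_of_recon) ** 2)
```

For a perfectly equivariant autoencoder (e.g. the identity map) the loss is exactly zero; during fine-tuning, this term would be added to the usual reconstruction loss so that the latent space commutes with the chosen transformations.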
Problem

Research questions and friction points this paper is trying to address.

time series generation
equivariance
latent space
flow-based models
generative modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

equivariance regularization
latent flow matching
time series generation
geometric inductive bias
efficient sampling
Camilo Carvajal Reyes
Imperial College London
Felipe Tobar
Imperial College London
Signal Processing · Machine Learning