🤖 AI Summary
Existing time-series generation methods struggle to model long-range temporal dependencies and complex inter-channel correlations. To address this, we propose a diffusion-based generative framework built upon the Mamba architecture. Our key contributions are: (1) Lag Fusion Mamba, the first approach to embed lag-aware modeling into the selective state space, enabling multi-scale temporal dependency capture; and (2) Permutation Scanning Mamba, which explicitly models dynamic cross-channel dependencies via a learnable channel-permutation scanning mechanism. Both are unified within an extended state-transition-matrix framework. The model integrates lag-aware denoising and channel-aware diffusion in an end-to-end manner. Extensive experiments on multiple benchmark datasets demonstrate significant improvements in generated sequence fidelity, periodic consistency, and preservation of inter-channel correlation structures, establishing a novel paradigm for time-series generation.
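To make the lag-fusion idea concrete, here is a minimal, hypothetical sketch of a discrete state-space scan (h_t = A h_{t-1} + B u_t, y_t = C h_t) whose input u_t optionally fuses lagged observations x_{t-l} with scalar weights. The function name `ssm_scan` and the scalar lag-weighting scheme are illustrative assumptions, not the paper's actual Lag Fusion Mamba implementation.

```python
import numpy as np

def ssm_scan(x, A, B, C, lags=(), lag_weights=()):
    """Minimal discrete state-space scan with optional lag-aware input fusion.

    h_t = A h_{t-1} + B u_t,  y_t = C h_t,
    where u_t = x_t + sum_k w_k * x_{t - l_k}  (illustrative lag fusion).
    """
    T = x.shape[0]
    h = np.zeros(A.shape[0])
    ys = []
    for t in range(T):
        u = x[t].copy()
        for w, l in zip(lag_weights, lags):
            if t - l >= 0:
                u = u + w * x[t - l]  # fuse a lagged input (toy stand-in)
        h = A @ h + B @ u             # state update
        ys.append(C @ h)              # readout
    return np.array(ys)
```

With diagonal A and nonzero lag weights, outputs at step t depend on x_{t-l} both through the recurrent state and through the fused input, which is the multi-scale dependency effect the summary describes.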
📝 Abstract
Time series data plays a pivotal role in a wide variety of fields but faces challenges related to privacy concerns. Recently, synthesizing data via diffusion models has been viewed as a promising solution. However, existing methods still struggle to capture long-range temporal dependencies and complex channel interrelations. In this research, we aim to leverage the sequence modeling capability of a State Space Model called Mamba and extend its applicability to time series data generation. We first analyze the core limitations of State Space Models, namely the lack of consideration for correlated temporal lags and channel permutations. Building on this insight, we propose Lag Fusion Mamba and Permutation Scanning Mamba, which enhance the model's ability to discern significant patterns during the denoising process. Theoretical analysis reveals that both variants share a unified matrix multiplication framework with the original Mamba, offering a deeper understanding of our method. Finally, we integrate the two variants and introduce Diffusion Mamba for Time Series (DiM-TS), a high-quality time series generation model that better preserves temporal periodicity and inter-channel correlations. Comprehensive experiments on public datasets demonstrate the superiority of DiM-TS in generating realistic time series while preserving diverse properties of the data.
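The channel-permutation scanning idea can be illustrated with a toy scalar recurrence that visits channels in a permuted order, so that each channel's output mixes information from the channels visited before it. The function `permuted_channel_scan`, the fixed permutation, and the decay factor `a` are all hypothetical simplifications; in the paper the scanning order is learnable and operates inside the selective state space.

```python
import numpy as np

def permuted_channel_scan(x, perm, a=0.5):
    """Scalar recurrence over channels visited in a permuted order.

    `perm` plays the role of a channel-permutation scanning order;
    here it is a fixed list purely for illustration.
    """
    h = 0.0
    out = np.zeros_like(x, dtype=float)
    for i in perm:
        h = a * h + x[i]  # carry state across previously visited channels
        out[i] = h        # each channel sees the channels scanned before it
    return out
```

Changing `perm` changes which cross-channel dependencies the scan can express, which is why a learnable permutation helps model dynamic channel correlations.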