DiM-TS: Bridge the Gap between Selective State Space Models and Time Series for Generative Modeling

📅 2025-11-23
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Existing time-series generation methods struggle to model long-range temporal dependencies and complex inter-channel correlations. To address this, we propose a diffusion-based generative framework built upon the Mamba architecture. Our key contributions are: (1) Lag Fusion Mamba, the first approach to embed lag-aware modeling into the selective state space, enabling multi-scale temporal dependency capture; and (2) Permutation Scanning Mamba, which explicitly models dynamic cross-channel dependencies via a learnable channel-permutation scanning mechanism; both are unified within an extended state transition matrix framework. The model integrates lag-aware denoising and channel-aware diffusion in an end-to-end manner. Extensive experiments on multiple benchmark datasets demonstrate significant improvements in generated sequence fidelity, periodic consistency, and preservation of inter-channel correlation structures, establishing a novel paradigm for time-series generation.

๐Ÿ“ Abstract
Time series data plays a pivotal role in a wide variety of fields but faces challenges related to privacy concerns. Recently, synthesizing data via diffusion models has been viewed as a promising solution. However, existing methods still struggle to capture long-range temporal dependencies and complex channel interrelations. In this research, we aim to utilize the sequence modeling capability of a State Space Model called Mamba and extend its applicability to time series data generation. We first analyze the core limitations of the State Space Model, namely its lack of consideration for correlated temporal lags and channel permutation. Building upon this insight, we propose Lag Fusion Mamba and Permutation Scanning Mamba, which enhance the model's ability to discern significant patterns during the denoising process. Theoretical analysis reveals that both variants share a unified matrix multiplication framework with the original Mamba, offering a deeper understanding of our method. Finally, we integrate the two variants and introduce Diffusion Mamba for Time Series (DiM-TS), a high-quality time series generation model that better preserves temporal periodicity and inter-channel correlations. Comprehensive experiments on public datasets demonstrate the superiority of DiM-TS in generating realistic time series while preserving diverse properties of the data.
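The "unified matrix multiplication framework" the abstract refers to rests on the standard discretized, input-dependent state space recurrence that Mamba-style models use. As a rough illustration only (the paper's implementation is not shown here; the function name, shapes, and parameterization are assumptions), a minimal selective SSM scan looks like:

```python
import numpy as np

def selective_ssm_scan(x, A, B_proj, C_proj, dt_proj):
    """Minimal selective state space scan (Mamba-style), illustrative only.

    x: (L, D) input sequence; A: (D, N) state matrix;
    B_proj, C_proj: (D, N) projection parameters; dt_proj: (D,) step-size params.
    """
    L, D = x.shape
    N = A.shape[1]
    h = np.zeros((D, N))          # hidden state, one N-dim state per channel
    y = np.zeros((L, D))
    for t in range(L):
        # Selectivity: the effective step size depends on the current input
        dt = np.log1p(np.exp(dt_proj * x[t]))[:, None]   # softplus, (D, 1)
        A_bar = np.exp(dt * A)                            # ZOH-style discretization
        B_bar = dt * B_proj * x[t][:, None]               # input-driven term
        h = A_bar * h + B_bar                             # state transition
        y[t] = (h * C_proj).sum(axis=1)                   # readout per channel
    return y
```

The paper's two variants can then be read as modifications of the same `A_bar * h + B_bar` transition, which is why they reduce to one extended state transition matrix framework.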
Problem

Research questions and friction points this paper is trying to address.

Addresses the challenge of capturing long-range temporal dependencies in time series
Overcomes limitations in modeling complex inter-channel relations for data generation
Mitigates privacy concerns by generating realistic synthetic time series in place of real data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Using the Mamba State Space Model for time series generation
Introducing the Lag Fusion Mamba and Permutation Scanning Mamba variants
Creating a unified diffusion framework that preserves temporal dependencies
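The lag-fusion idea above, as described, lets the state update draw on several correlated temporal lags rather than only the previous step. A hedged sketch of that concept (the function name, lag set, and mixing scheme are assumptions for illustration, not the paper's actual Lag Fusion Mamba):

```python
import numpy as np

def lag_fused_update(h_hist, A_bar, B_bar_x, lag_weights, lags=(1, 2, 4)):
    """Illustrative lag-aware state update (shapes/names are assumptions).

    h_hist: list of past hidden states; h_hist[-k] is the state k steps back.
    A_bar: (D, N) discretized transition; B_bar_x: (D, N) input-driven term.
    lag_weights: mixing weights over the chosen lags.
    """
    # Fuse states from several lags, then apply the usual SSM transition
    fused = sum(w * h_hist[-k] for w, k in zip(lag_weights, lags))
    return A_bar * fused + B_bar_x
```

With `lag_weights = [1, 0, 0]` this collapses to the plain one-step recurrence, which is consistent with the claim that the variant stays within the same matrix multiplication framework as the original Mamba.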
Zihao Yao
University of Sydney
Graph Neural Networks · Deep Learning · Blockchain · Anomaly Detection
Jiankai Zuo
The Key Laboratory of Embedded System and Service Computing, Ministry of Education, Tongji University, Shanghai 200092, China
Yaying Zhang
The Key Laboratory of Embedded System and Service Computing, Ministry of Education, Tongji University, Shanghai 200092, China