Diff-MN: Diffusion Parameterized MoE-NCDE for Continuous Time Series Generation with Irregular Observations

๐Ÿ“… 2026-01-20
๐Ÿ“ˆ Citations: 1
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work addresses the challenge of generating high-resolution continuous time series from irregular and sparse observations, a setting where existing methods, which typically assume regularly sampled data at a fixed resolution, struggle to perform well. To this end, the authors propose Diff-MN, a framework that integrates Mixture-of-Experts (MoE) with Neural Controlled Differential Equations (NCDEs) to form a MoE-NCDE architecture featuring dynamically parameterized expert functions and a decoupled optimization structure. This design enables sample-adaptive expert configuration, supporting high-fidelity continuous-time modeling. The framework is trained end-to-end and jointly models the distribution of observations and timestamps. Extensive experiments across ten real-world and synthetic datasets show that Diff-MN consistently outperforms state-of-the-art baselines on both irregular-to-regular and irregular-to-continuous generation tasks.

๐Ÿ“ Abstract
Time series generation (TSG) is widely used across domains, yet most existing methods assume regular sampling and fixed output resolutions. These assumptions are often violated in practice, where observations are irregular and sparse, while downstream applications require continuous and high-resolution TS. Although Neural Controlled Differential Equations (NCDEs) are promising for modeling irregular TS, they are constrained by a single dynamics function, tightly coupled optimization, and a limited ability to adapt learned dynamics to newly generated samples from the generative model. We propose Diff-MN, a continuous TSG framework that enhances NCDE with a Mixture-of-Experts (MoE) dynamics function and a decoupled architectural design for dynamics-focused training. To further enable NCDE to generalize to newly generated samples, Diff-MN employs a diffusion model to parameterize the NCDE temporal dynamics parameters (MoE weights), i.e., to jointly learn the distribution of TS data and MoE weights. This design allows sample-specific NCDE parameters to be generated for continuous TS generation. Experiments on ten public and synthetic datasets demonstrate that Diff-MN consistently outperforms strong baselines on both irregular-to-regular and irregular-to-continuous TSG tasks. The code is available at https://github.com/microsoft/TimeCraft/tree/main/Diff-MN.
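To make the core idea concrete, here is a minimal sketch of an NCDE whose vector field is a gated mixture of experts, integrated with an Euler step along an irregularly sampled control path. This is not the paper's implementation: the linear experts, the state-dependent softmax gate, the elementwise controlled update (a scalar-channel simplification of the usual matrix-valued f(z) dX), and all names (`moe_dynamics`, `ncde_euler`) are illustrative assumptions; the actual Diff-MN method additionally generates the MoE weights with a diffusion model.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_dynamics(z, experts, gate):
    """Mixture-of-Experts vector field f(z): a gated sum of linear experts.

    experts: list of d x d matrices (one linear expert each).
    gate:    list of d-vectors; expert k's score is <gate[k], z>,
             so the mixture weights depend on the hidden state z.
    """
    scores = [sum(g_i * z_i for g_i, z_i in zip(g, z)) for g in gate]
    w = softmax(scores)  # sample- and state-adaptive expert weights
    d = len(z)
    out = [0.0] * d
    for wk, E in zip(w, experts):
        for i in range(d):
            out[i] += wk * sum(E[i][j] * z[j] for j in range(d))
    return out

def ncde_euler(z0, path, experts, gate):
    """Integrate dz = f(z) dX along an irregularly sampled control path X.

    path: list of (t, x) observations; irregular spacing is handled
    naturally because each step scales by the increment dX = x_{k+1} - x_k,
    regardless of how far apart the timestamps are.
    """
    z = list(z0)
    for (t0, x0), (t1, x1) in zip(path, path[1:]):
        dx = [b - a for a, b in zip(x0, x1)]   # control increment
        f = moe_dynamics(z, experts, gate)     # MoE-parameterized dynamics
        z = [zi + fi * dxi for zi, fi, dxi in zip(z, f, dx)]
    return z
```

Because the state evolves continuously between observations, the same trained dynamics can be queried at any output resolution, which is what enables the irregular-to-continuous generation setting described in the abstract.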
Problem

Research questions and friction points this paper is trying to address.

time series generation
irregular observations
continuous generation
neural controlled differential equations
mixture-of-experts
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mixture-of-Experts
Neural Controlled Differential Equations
Irregular Time Series
Continuous Time Series Generation
Dynamic Expert Configuration
๐Ÿ”Ž Similar Papers
No similar papers found.