🤖 AI Summary
This work addresses the challenge of generating high-resolution continuous time series from irregular and sparse observations, a setting where existing methods, which typically assume regularly sampled data at a fixed resolution, struggle to perform well. To this end, we propose Diff-MN, a framework that integrates Mixture-of-Experts (MoE) with Neural Controlled Differential Equations (NCDEs) into a MoE-NCDE architecture featuring dynamically parameterized expert functions and a decoupled optimization structure. This design enables sample-adaptive expert configuration, facilitating high-fidelity continuous-time modeling. The framework supports end-to-end training and jointly models the distribution of observations and timestamps. Extensive experiments across ten real-world and synthetic datasets demonstrate that Diff-MN consistently outperforms state-of-the-art baselines on both irregular-to-regular and irregular-to-continuous generation tasks.
📄 Abstract
Time series generation (TSG) is widely used across domains, yet most existing methods assume regular sampling and fixed output resolutions. These assumptions are often violated in practice, where observations are irregular and sparse while downstream applications require continuous, high-resolution time series (TS). Although Neural Controlled Differential Equations (NCDEs) are promising for modeling irregular TS, they are constrained by a single dynamics function, tightly coupled optimization, and a limited ability to adapt learned dynamics to newly generated samples from the generative model. We propose Diff-MN, a continuous TSG framework that enhances the NCDE with a Mixture-of-Experts (MoE) dynamics function and a decoupled architectural design for dynamics-focused training. To further enable the NCDE to generalize to newly generated samples, Diff-MN employs a diffusion model to parameterize the NCDE's temporal dynamics parameters (the MoE weights), i.e., it jointly learns the distribution of TS data and MoE weights. This design allows sample-specific NCDE parameters to be generated for continuous TS generation. Experiments on ten public and synthetic datasets demonstrate that Diff-MN consistently outperforms strong baselines on both irregular-to-regular and irregular-to-continuous TSG tasks. The code is available at https://github.com/microsoft/TimeCraft/tree/main/Diff-MN.
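To make the core idea concrete, here is a minimal NumPy sketch of an MoE dynamics function inside an Euler-discretized controlled differential equation. All dimensions, the linear form of the experts, and the fixed mixture weights are illustrative assumptions (not the paper's implementation); in Diff-MN the MoE weights would come from the diffusion model per sample, and the official code at the repository above is the reference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative, not from the paper):
# latent state z in R^D_Z, control path X in R^D_X, N_EXPERTS experts.
D_Z, D_X, N_EXPERTS = 4, 2, 3

# Each expert is a simple linear vector field mapping the latent state z
# to a (D_Z x D_X) matrix that multiplies the control increment dX.
expert_params = [rng.normal(scale=0.1, size=(D_Z * D_X, D_Z))
                 for _ in range(N_EXPERTS)]

def moe_vector_field(z, weights):
    """MoE dynamics f(z) = sum_k w_k * f_k(z), returning a (D_Z, D_X) matrix."""
    out = np.zeros((D_Z, D_X))
    for w, W in zip(weights, expert_params):
        out += w * (W @ z).reshape(D_Z, D_X)
    return out

def euler_cde(z0, path, weights):
    """Euler discretization of the NCDE dz = f(z) dX along a control path."""
    z = z0.copy()
    for x_prev, x_next in zip(path[:-1], path[1:]):
        z = z + moe_vector_field(z, weights) @ (x_next - x_prev)
    return z

# Sample-specific MoE weights; in Diff-MN these would be generated by the
# diffusion model for each sample rather than fixed by hand.
weights = np.array([0.5, 0.3, 0.2])
# The control path would come from interpolating irregular observations;
# here it is random data purely for illustration.
path = rng.normal(size=(10, D_X))
z_T = euler_cde(rng.normal(size=D_Z), path, weights)
print(z_T.shape)  # (4,)
```

Because the mixture weights enter the vector field linearly, swapping in a different weight vector changes the learned dynamics without retraining the experts, which is the sample-adaptive behavior the abstract describes.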