AI Summary
Existing methods for conditional time-series generation suffer from three key limitations: (i) poor generalizability of conditioning signals across unseen scenarios, (ii) low inference efficiency due to autoregressive sequential sampling, and (iii) redundant encoding of categorical features. This paper proposes a novel denoising diffusion probabilistic model framework that jointly models constraints and signals, introduces segment-wise parallel sampling, and employs periodic embedding "stitching" to achieve zero-shot conditional generalization at inference time. A cross-segment consistency constraint ensures high-fidelity generation, while categorical feature representations are significantly compressed. Experiments demonstrate that our method reduces mean squared error by up to 10× compared to baselines, accelerates generation by 460× over autoregressive models, maintains comparable accuracy, and drastically reduces input dimensionality.
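The cross-segment consistency idea can be illustrated with a toy sketch. The function below is purely hypothetical, not WaveStitch's actual (learned) stitching mechanism: it stands in for the consistency constraint by cross-fading the overlapping regions of segments that were generated independently in parallel, so the concatenated series has no seams at segment boundaries.

```python
import numpy as np

def stitch_segments(segments, overlap):
    """Blend consecutive segments by linearly cross-fading their
    overlapping regions. This is an illustrative stand-in for a
    learned cross-segment consistency constraint, not the actual
    WaveStitch mechanism."""
    out = segments[0]
    for seg in segments[1:]:
        w = np.linspace(0.0, 1.0, overlap)  # cross-fade weights 0 -> 1
        blended = (1 - w) * out[-overlap:] + w * seg[:overlap]
        out = np.concatenate([out[:-overlap], blended, seg[overlap:]])
    return out

# Two toy "segments" that disagree slightly in their 4-step overlap:
a = np.linspace(0.0, 1.0, 8)
b = np.linspace(0.9, 2.0, 8)
stitched = stitch_segments([a, b], overlap=4)
print(stitched.shape)  # (12,): 8 + 8 - 4 overlapping steps
```

Because all segments can be denoised simultaneously and only the cheap blending step is sequential, this kind of scheme avoids the token-by-token bottleneck of autoregressive sampling.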
Abstract
Generating temporal data under constraints is critical for forecasting, imputation, and synthesis. These datasets often include auxiliary conditions that influence the values within the time series signal. Existing methods face three key challenges: (1) they fail to adapt to conditions at inference time; (2) they rely on sequential generation, which slows generation; and (3) they inefficiently encode categorical features, leading to increased sparsity and input sizes. We propose WaveStitch, a novel method that addresses these challenges by leveraging denoising diffusion probabilistic models to efficiently generate accurate temporal data under given auxiliary constraints. WaveStitch overcomes these limitations by: (1) modeling interactions between constraints and signals to generalize to new, unseen conditions; (2) enabling the parallel synthesis of sequential segments with a novel "stitching" mechanism to enforce coherence across segments; and (3) encoding categorical features as compact periodic signals while preserving temporal patterns. Extensive evaluations across diverse datasets highlight WaveStitch's ability to generalize to unseen conditions during inference, achieving up to a 10x lower mean squared error compared to state-of-the-art methods. Moreover, WaveStitch generates data up to 460x faster than autoregressive methods while maintaining comparable accuracy. By efficiently encoding categorical features, WaveStitch provides a robust and efficient solution for temporal data generation. Our code is open-sourced: https://github.com/adis98/HierarchicalTS
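The dimensionality benefit of periodic encoding can be seen with a minimal sketch. The sin/cos mapping below is an assumed illustration of the general idea (it is not claimed to be WaveStitch's exact scheme): a cyclic categorical feature such as hour-of-day is mapped to a point on the unit circle, giving a 2-dimensional dense representation instead of a K-dimensional sparse one-hot vector, while preserving the feature's periodic structure (hour 23 sits next to hour 0).

```python
import numpy as np

def one_hot(k, K):
    """Standard one-hot encoding: K dimensions, all but one zero."""
    v = np.zeros(K)
    v[k] = 1.0
    return v

def periodic_encoding(k, K):
    """Encode category k of K cyclic categories as a point on the
    unit circle: 2 dimensions regardless of K, and adjacent
    categories (including K-1 and 0) stay close together."""
    theta = 2.0 * np.pi * k / K
    return np.array([np.sin(theta), np.cos(theta)])

K = 24  # e.g., hour-of-day treated as a categorical feature
print(one_hot(3, K).shape)           # (24,) -- sparse, grows with K
print(periodic_encoding(3, K).shape)  # (2,)  -- compact, fixed size
```

Note how the input width stays at 2 even as K grows, which is the source of the dimensionality reduction the abstract refers to.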