🤖 AI Summary
Existing diffusion-based time-series models rely on fixed Gaussian priors that fail to capture temporal structure, limiting generation quality. To address this, we propose TSFlow—the first conditional probabilistic forecasting model to integrate Conditional Gaussian Processes (CGPs) into the flow matching framework. TSFlow constructs data-dependent priors connected to the data via optimal transport paths and introduces a conditional prior sampling mechanism, so the generative process better reflects intrinsic temporal correlations and uncertainty. Crucially, it supports both high-fidelity unconditional generation and flexible conditional forecasting without modifying the training objective. Evaluated on eight real-world datasets, TSFlow significantly improves unconditional generation metrics over state-of-the-art methods and achieves state-of-the-art or competitive performance across multiple probabilistic forecasting benchmarks.
📝 Abstract
Recent advancements in generative modeling, particularly diffusion models, have opened new directions for time series modeling, achieving state-of-the-art performance in forecasting and synthesis. However, the reliance of diffusion-based models on a simple, fixed prior complicates the generative process since the data and prior distributions differ significantly. We introduce TSFlow, a conditional flow matching (CFM) model for time series combining Gaussian processes, optimal transport paths, and data-dependent prior distributions. By incorporating (conditional) Gaussian processes, TSFlow aligns the prior distribution more closely with the temporal structure of the data, enhancing both unconditional and conditional generation. Furthermore, we propose conditional prior sampling to enable probabilistic forecasting with an unconditionally trained model. In our experimental evaluation on eight real-world datasets, we demonstrate the generative capabilities of TSFlow, producing high-quality unconditional samples. Finally, we show that both conditionally and unconditionally trained models achieve competitive results across multiple forecasting benchmarks.