Conditionally Whitened Generative Models for Probabilistic Time Series Forecasting

📅 2025-09-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
Probabilistic forecasting of multivariate time series faces three core challenges: non-stationarity, difficulty in modeling inter-variable dependencies, and distributional shift. To address these, we propose Conditional Whitening Generative Framework (CW-Gen), the first approach to incorporate conditional whitening into diffusion and flow-matching models. CW-Gen leverages structural priors—including dynamic mean estimation and sliding-window covariance—to enhance generative fidelity. We theoretically derive the optimal terminal distribution condition and design a Joint Mean-Covariance Estimator (JMCE) for simultaneous learning of both components. Based on this, we instantiate two novel architectures: CW-Diff and CW-Flow. Extensive experiments across five real-world datasets demonstrate that CW-Gen consistently outperforms six state-of-the-art generative baselines, achieving superior accuracy in capturing non-stationary dynamics and cross-variable dependencies while effectively mitigating distributional shift.

📝 Abstract
Probabilistic forecasting of multivariate time series is challenging due to non-stationarity, inter-variable dependencies, and distribution shifts. While recent diffusion and flow matching models have shown promise, they often ignore informative priors such as conditional means and covariances. In this work, we propose Conditionally Whitened Generative Models (CW-Gen), a framework that incorporates prior information through conditional whitening. Theoretically, we establish sufficient conditions under which replacing the traditional terminal distribution of diffusion models, namely the standard multivariate normal, with a multivariate normal distribution parameterized by estimators of the conditional mean and covariance improves sample quality. Guided by this analysis, we design a novel Joint Mean-Covariance Estimator (JMCE) that simultaneously learns the conditional mean and sliding-window covariance. Building on JMCE, we introduce Conditionally Whitened Diffusion Models (CW-Diff) and extend them to Conditionally Whitened Flow Matching (CW-Flow). Experiments on five real-world datasets with six state-of-the-art generative models demonstrate that CW-Gen consistently enhances predictive performance, capturing non-stationary dynamics and inter-variable correlations more effectively than prior-free approaches. Empirical results further demonstrate that CW-Gen can effectively mitigate the effects of distribution shift.
Problem

Research questions and friction points this paper is trying to address.

Addressing non-stationarity and distribution shifts in multivariate time series forecasting
Incorporating conditional means and covariances as informative priors in generative models
Improving sample quality by replacing standard normal terminal distributions with parameterized ones
Innovation

Methods, ideas, or system contributions that make the work stand out.

Conditional whitening framework incorporates prior information
Joint Mean-Covariance Estimator learns conditional statistics simultaneously
Replaces standard normal with parameterized multivariate normal distribution
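The conditional-whitening idea above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function names and the NumPy-based round trip are assumptions for exposition. Given estimates of the conditional mean and covariance of the forecast window, the target is mapped toward standard-normal coordinates before the generative model runs, and samples are mapped back afterwards so they inherit the estimated statistics.

```python
import numpy as np

# Hypothetical sketch of conditional whitening (illustrative names, not
# the paper's code). Whitening: z = L^{-1}(x - mu) with Sigma = L L^T;
# de-whitening inverts the map, so generated z ~ N(0, I) yields
# x ~ N(mu, Sigma) under the estimated conditional statistics.

def whiten(x, mu, Sigma, eps=1e-6):
    """Map x toward standard-normal coordinates via the Cholesky factor."""
    d = Sigma.shape[0]
    # Small jitter keeps the estimated covariance positive definite.
    L = np.linalg.cholesky(Sigma + eps * np.eye(d))
    return np.linalg.solve(L, x - mu)

def dewhiten(z, mu, Sigma, eps=1e-6):
    """Inverse map: x = L z + mu."""
    d = Sigma.shape[0]
    L = np.linalg.cholesky(Sigma + eps * np.eye(d))
    return L @ z + mu

# Round trip: de-whitening after whitening recovers the input.
rng = np.random.default_rng(0)
mu = rng.normal(size=3)
A = rng.normal(size=(3, 3))
Sigma = A @ A.T + np.eye(3)   # a valid (positive-definite) covariance
x = rng.normal(size=3)
x_back = dewhiten(whiten(x, mu, Sigma), mu, Sigma)
assert np.allclose(x, x_back, atol=1e-8)
```

In the paper's framing, diffusion or flow matching then operates in the whitened space, which is closer to the standard-normal terminal distribution than the raw series.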
Yanfeng Yang
The Institute of Statistical Mathematics
Generative model, Causal inference, Applied statistics
Siwei Chen
National University of Singapore
Robotics, Planning, Imitation learning, Reinforcement learning
Pingping Hu
East China Normal University, Shanghai, China
Zhaotong Shen
East China Normal University, Shanghai, China
Yingjie Zhang
East China Normal University, Shanghai, China
Zhuoran Sun
East China Normal University, Shanghai, China
Shuai Li
East China Normal University, Shanghai, China
Ziqi Chen
East China Normal University, Shanghai, China
Kenji Fukumizu
The Institute of Statistical Mathematics
Machine learning, Statistics