TimeGMM: Single-Pass Probabilistic Forecasting via Adaptive Gaussian Mixture Models with Reversible Normalization

📅 2026-01-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes TimeGMM, a novel framework for probabilistic time series forecasting that overcomes limitations of existing methods—such as reliance on computationally expensive sampling or restrictive parametric assumptions—by efficiently modeling complex, arbitrary-shaped predictive distributions through a Gaussian Mixture Model (GMM) in a single forward pass. To address temporal–probabilistic distribution shifts, TimeGMM introduces GMM-adaptive Reversible Instance Normalization (GRIN), which dynamically corrects such misalignments. The framework further incorporates a Conditional Temporal-Probabilistic Decoder (CTPD) and a dedicated Temporal Encoder (TE) to jointly learn temporal dependencies and GMM parameters. Experimental results demonstrate that TimeGMM achieves significant improvements over state-of-the-art approaches, with relative gains of up to 22.48% in CRPS and 21.23% in NMAE.
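As a rough illustration of what "single-pass" GMM forecasting means (a generic sketch, not the paper's architecture; the function names `gmm_params` and `gmm_pdf` are hypothetical), a network head can emit one raw vector per horizon step that is split into mixture weights, means, and scales, so the full predictive density is available without any sampling loop:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def gmm_params(raw, k):
    """Split one raw output vector of length 3k (one forward pass)
    into mixture weights, means, and positive scales."""
    logits, means, log_scales = raw[:k], raw[k:2 * k], raw[2 * k:3 * k]
    weights = softmax(logits)
    scales = [math.exp(ls) for ls in log_scales]  # exp keeps scales > 0
    return weights, means, scales

def gmm_pdf(y, weights, means, scales):
    """Density of the Gaussian mixture at point y."""
    total = 0.0
    for w, mu, s in zip(weights, means, scales):
        total += w * math.exp(-0.5 * ((y - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    return total
```

Because the mixture can be multimodal, this kind of head can represent arbitrary-shaped predictive distributions that a single-Gaussian parametric assumption cannot.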

📝 Abstract
Probabilistic time series forecasting is crucial for quantifying future uncertainty, with significant applications in fields such as energy and finance. However, existing methods often rely on computationally expensive sampling or restrictive parametric assumptions to characterize future distributions, which limits predictive performance and introduces distributional mismatch. To address these challenges, this paper presents TimeGMM, a novel probabilistic forecasting framework based on Gaussian Mixture Models (GMM) that captures complex future distributions in a single forward pass. A key component is GMM-adapted Reversible Instance Normalization (GRIN), a novel module designed to dynamically adapt to temporal-probabilistic distribution shifts. The framework integrates a dedicated Temporal Encoder (TE-Module) with a Conditional Temporal-Probabilistic Decoder (CTPD-Module) to jointly capture temporal dependencies and mixture distribution parameters. Extensive experiments demonstrate that TimeGMM consistently outperforms state-of-the-art methods, achieving maximum improvements of 22.48% in CRPS and 21.23% in NMAE.
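The GRIN idea builds on reversible instance normalization (RevIN): normalize each input window by its own statistics, predict in the normalized space, then invert the transform on the outputs. A minimal sketch of how that inversion could apply to mixture parameters follows; it uses standard per-instance statistics and hypothetical helper names, and is not the paper's GRIN module (means are shifted and rescaled, scales only rescaled, mixture weights left unchanged because they are scale-invariant):

```python
import math

def instance_stats(x, eps=1e-8):
    """Per-window mean and std used to normalize the input series."""
    m = sum(x) / len(x)
    var = sum((v - m) ** 2 for v in x) / len(x)
    return m, math.sqrt(var) + eps

def normalize(x, m, s):
    """Map the raw window into normalized space before encoding."""
    return [(v - m) / s for v in x]

def denormalize_gmm(weights, means, scales, m, s):
    """Invert the normalization on predicted mixture parameters:
    means are affinely mapped back, scales are rescaled,
    and weights pass through untouched."""
    return weights, [mu * s + m for mu in means], [sd * s for sd in scales]
```

The reversibility is what lets the model train on stationarized inputs while still emitting a calibrated distribution on the original scale.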
Problem

Research questions and friction points this paper is trying to address.

probabilistic forecasting
time series
distribution mismatch
Gaussian Mixture Models
temporal dependencies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gaussian Mixture Models
Reversible Instance Normalization
Probabilistic Forecasting
Single-Pass Inference
Temporal-Probabilistic Modeling
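The headline metric above, CRPS, can be estimated for any predictive distribution from samples via the energy form CRPS(F, y) ≈ E|X - y| - 0.5 E|X - X'|. A small illustrative helper (the name `crps_empirical` is ours, and the O(n^2) double sum is chosen for clarity, not efficiency):

```python
def crps_empirical(samples, y):
    """Sample-based CRPS estimate: E|X - y| - 0.5 * E|X - X'|.
    Lower is better; with a single sample it reduces to absolute error."""
    n = len(samples)
    term1 = sum(abs(x - y) for x in samples) / n
    term2 = sum(abs(a - b) for a in samples for b in samples) / (n * n)
    return term1 - 0.5 * term2
```

The second term rewards sharpness: a forecaster that spreads its samples widely pays for it, so CRPS balances calibration against concentration.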
Lei Liu
Anhui University of Science & Technology
CV
Tengyuan Liu
University of Science and Technology of China, Hefei, 230026, China
Hongwei Zhao
University of Science and Technology of China, Hefei, 230026, China
Jiahui Huang
NVIDIA
3D Computer Vision, Graphics
Ruibo Guo
University of Science and Technology of China, Hefei, 230026, China
Bin Li
University of Science and Technology of China, Hefei, 230026, China