CoGenCast: A Coupled Autoregressive-Flow Generative Framework for Time Series Forecasting

πŸ“… 2026-02-03
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work addresses the difficulty existing methods face in jointly modeling semantic context and continuous stochastic dynamics in time series. To this end, the authors propose a hybrid generative framework that reconfigures the attention topology of pretrained decoder-only large language models into an encoder-decoder forecasting architecture, and integrates a flow-matching mechanism to explicitly capture the continuous stochastic evolution process. The approach supports unified cross-domain training and multimodal prediction, and yields substantial gains over current baselines across multiple benchmark datasets. The results demonstrate superior forecasting accuracy and strong generalization, highlighting the effectiveness of jointly modeling semantic and dynamic components.
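The reconfigured attention topology the summary mentions can be pictured with a small sketch (a generic illustration of the idea, not the authors' code): context positions attend to each other bidirectionally, encoder-style, while forecast positions attend causally to the context and to earlier forecast positions, decoder-style.

```python
import numpy as np

def hybrid_attention_mask(n_ctx, n_gen):
    """Illustrative encoder-decoder attention mask obtained from a
    decoder-only layout by changing only the attention topology.
    1 = attention allowed, 0 = blocked.
    """
    n = n_ctx + n_gen
    mask = np.zeros((n, n), dtype=int)
    # Encoder block: full bidirectional attention among context tokens.
    mask[:n_ctx, :n_ctx] = 1
    # Decoder rows: each forecast token sees all context tokens
    # plus itself and earlier forecast tokens (causal).
    for i in range(n_ctx, n):
        mask[i, :n_ctx] = 1
        mask[i, n_ctx:i + 1] = 1
    return mask
```

With `n_ctx=3, n_gen=2`, the top-left 3x3 block is all ones (bidirectional context), while rows 3 and 4 are lower-triangular over the forecast columns.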

πŸ“ Abstract
Time series forecasting can be viewed as a generative problem that requires both semantic understanding of contextual conditions and stochastic modeling of continuous temporal dynamics. Existing approaches typically rely either on autoregressive large language models (LLMs) for semantic context modeling or on diffusion-like models for continuous probabilistic generation; neither alone adequately models both aspects simultaneously. In this work, we propose CoGenCast, a hybrid generative framework that couples pre-trained LLMs with a flow-matching mechanism for effective time series forecasting. Specifically, we reconfigure pre-trained decoder-only LLMs into a native encoder-decoder forecasting backbone by modifying only the attention topology, enabling bidirectional context encoding and causal representation generation. Building on this, a flow-matching mechanism is integrated to model temporal evolution, capturing continuous stochastic dynamics conditioned on the autoregressively generated representations. Notably, CoGenCast naturally supports multimodal forecasting and cross-domain unified training. Extensive experiments on multiple benchmarks show that CoGenCast consistently outperforms the compared baselines. Code is available at https://github.com/liuyaguo/_CoGenCast.
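As background on the flow-matching component the abstract describes, the standard conditional flow-matching recipe regresses a velocity field along straight paths between noise and data. The sketch below shows that generic recipe only (it is not the paper's implementation, and conditioning on the LLM-generated representations is omitted):

```python
import numpy as np

def flow_matching_targets(x0, x1, rng):
    """Build one batch of flow-matching regression targets.
    Given noise samples x0 and data samples x1, draw a time t,
    form the linear interpolant x_t, and return the velocity
    target (x1 - x0) that a conditional velocity network would
    be trained to predict at (x_t, t).
    """
    t = rng.uniform(size=(x0.shape[0], 1))  # per-sample time in [0, 1]
    x_t = (1.0 - t) * x0 + t * x1           # point on the straight path
    v_target = x1 - x0                      # constant velocity along path
    return x_t, t, v_target
```

At sampling time, such a learned velocity field is integrated from noise (t=0) to a forecast sample (t=1), which is what gives the model its continuous probabilistic generation ability.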
Problem

Research questions and friction points this paper is trying to address.

time series forecasting
semantic context modeling
stochastic dynamics
autoregressive models
generative modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

autoregressive LLM
flow-matching
time series forecasting
hybrid generative framework
attention topology
Yaguo Liu
State Key Laboratory of Cognitive Intelligence, University of Science and Technology of China, Hefei, China
Mingyue Cheng
State Key Laboratory of Cognitive Intelligence, University of Science and Technology of China, Hefei, China
Daoyu Wang
State Key Laboratory of Cognitive Intelligence, University of Science and Technology of China, Hefei, China
Xiaoyu Tao
University of Science and Technology of China
Time Series Analysis
Qi Liu
University of Science and Technology of China
Data Mining, Educational Big Data, Recommender Systems, Social Network Analysis