CoRA: Covariate-Aware Adaptation of Time Series Foundation Models

📅 2025-10-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing time-series foundation models (TSFMs) are predominantly pretrained on univariate data, limiting their capacity to effectively model multivariate forecasting tasks with heterogeneous external covariates and hindering cross-dataset generalization. To address this, we propose CoRA—a lightweight, plug-and-play framework that freezes the pretrained backbone while introducing a Granger-causality-based embedding mechanism for interpretable covariate selection. CoRA further incorporates zero-initialized conditional injection and cross-modal embedding fusion, enabling covariate adaptation without catastrophic forgetting and preserving original feature extraction capabilities. Compatible with mainstream TSFMs, CoRA requires only minimal architectural adjustments to support multimodal covariate integration. Empirically, on covariate-aware forecasting benchmarks, CoRA reduces mean squared error (MSE) by 31.1% relative to strong baselines—outperforming both full fine-tuning and few-shot adaptation methods.

📝 Abstract
Time Series Foundation Models (TSFMs) have shown significant impact through their model capacity, scalability, and zero-shot generalization. However, due to the heterogeneity of inter-variate dependencies and limited backbone scalability on large-scale multivariate datasets, most TSFMs are typically pre-trained on univariate time series. This limitation renders them oblivious to crucial information from diverse covariates in real-world forecasting tasks. To further enhance the performance of TSFMs, we propose a general covariate-aware adaptation (CoRA) framework for TSFMs. It leverages the pre-trained backbones of foundation models while effectively incorporating exogenous covariates from various modalities, including time series, language, and images, to improve the quality of predictions. Technically, CoRA maintains the equivalence of initialization and parameter consistency during adaptation. With the preserved backbones of foundation models serving as frozen feature extractors, the resulting embeddings are empirically demonstrated to be more informative than raw data. Further, CoRA employs a novel Granger Causality Embedding (GCE) to automatically evaluate covariates with respect to their causal predictability of the target variate. We incorporate these weighted embeddings through a zero-initialized condition-injection mechanism, avoiding catastrophic forgetting of pre-trained foundation models and gradually integrating exogenous information. Extensive experiments show that CoRA-adapted TSFMs surpass state-of-the-art covariate-aware deep forecasters trained with full or few-shot samples, achieving a 31.1% MSE reduction on covariate-aware forecasting. Compared to other adaptation methods, CoRA exhibits strong compatibility with various advanced TSFMs and extends the scope of covariates to other modalities, presenting a practical paradigm for the application of TSFMs.
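The paper does not spell out the exact form of the Granger Causality Embedding, but its core idea — scoring each covariate by how much its history improves prediction of the target — can be sketched with a classic lag-regression Granger test. The function names (`granger_weight`, `covariate_weights`) and the softmax weighting are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def granger_weight(target, covariate, lag=2):
    """Granger-style score: how much do the covariate's lags reduce the
    target's one-step prediction error beyond the target's own lags?

    Returns log(SSE_restricted / SSE_full); values > 0 indicate the
    covariate's history carries predictive information.
    """
    T = len(target)
    rows = range(lag, T)
    y = np.array([target[t] for t in rows])
    # Restricted design: target lags only. Full design: target + covariate lags.
    X_r = np.array([[target[t - k] for k in range(1, lag + 1)] for t in rows])
    X_f = np.array([[target[t - k] for k in range(1, lag + 1)]
                    + [covariate[t - k] for k in range(1, lag + 1)] for t in rows])

    def sse(X, y):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return float(resid @ resid) + 1e-12  # guard against log(0)

    return float(np.log(sse(X_r, y) / sse(X_f, y)))

def covariate_weights(target, covariates, lag=2):
    """Softmax over Granger scores -> interpretable per-covariate weights."""
    scores = np.array([granger_weight(target, c, lag) for c in covariates])
    e = np.exp(scores - scores.max())
    return e / e.sum()
```

Under this sketch, a covariate that genuinely drives the target receives a markedly larger weight than an unrelated series, which is the behavior the GCE mechanism relies on for interpretable covariate selection.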
Problem

Research questions and friction points this paper is trying to address.

Enhancing Time Series Foundation Models with covariate-aware adaptation framework
Incorporating exogenous covariates from multiple modalities to improve predictions
Addressing limitations of univariate pre-training in multivariate forecasting tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leverages pre-trained backbones as frozen feature extractors
Employs Granger Causality Embedding to evaluate covariates
Uses zero-initialized condition-injection to prevent catastrophic forgetting
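The zero-initialized condition injection in the last bullet can be sketched as a covariate projection whose weights start at zero, so at initialization the adapted model reproduces the frozen backbone exactly (the "equivalence of initialization" the abstract mentions) and the covariate signal grows in only through training. The class name and the simple additive fusion are assumptions for illustration:

```python
import numpy as np

class ZeroInitInjection:
    """Injects (weighted) covariate embeddings into frozen backbone
    embeddings via a zero-initialized linear projection.

    Because W and b start at zero, the injection is a no-op at
    initialization: the output equals the frozen backbone's output,
    so no pre-trained behavior is disturbed before training begins.
    """

    def __init__(self, cov_dim, emb_dim):
        self.W = np.zeros((cov_dim, emb_dim))  # zero init: contributes nothing at start
        self.b = np.zeros(emb_dim)

    def __call__(self, backbone_emb, weighted_cov_emb):
        # backbone_emb: (batch, emb_dim) from the frozen TSFM backbone
        # weighted_cov_emb: (batch, cov_dim), e.g. GCE-weighted covariate embeddings
        return backbone_emb + weighted_cov_emb @ self.W + self.b
```

Gradient updates then move `W` away from zero, gradually blending exogenous information into the forecast without catastrophic forgetting.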
Authors

- Guo Qin (Tsinghua University)
- Zhi Chen (School of Software, BNRist, Tsinghua University, Beijing 100084, China)
- Yong Liu (School of Software, BNRist, Tsinghua University, Beijing 100084, China)
- Zhiyuan Shi (Researcher, Onfido, London)
- Haixuan Liu (School of Software, BNRist, Tsinghua University, Beijing 100084, China)
- Xiangdong Huang (School of Software, Tsinghua University)
- Jianmin Wang (School of Software, BNRist, Tsinghua University, Beijing 100084, China)
- Mingsheng Long (Associate Professor, Tsinghua University)