TFMAdapter: Lightweight Instance-Level Adaptation of Foundation Models for Forecasting with Covariates

๐Ÿ“… 2025-09-17
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Existing time-series foundation models (TSFMs) struggle to effectively incorporate domain-specific exogenous covariates known in advance, due to the absence of suitable inductive biases. This work proposes TFMAdapter, a lightweight, fine-tuning-free, instance-level adapter enabling the first non-parametric fusion of such covariates. It employs a two-stage strategy: first, a simple regression model generates covariate-driven pseudo-forecasts; second, a Gaussian process regressor adaptively fuses these pseudo-forecasts, the base TSFM's output, and the raw covariates to refine predictions. Evaluated across multiple real-world benchmarks, TFMAdapter improves forecasting accuracy over the underlying TSFM by 24-27%, substantially outperforming supervised baselines. Crucially, it achieves this with minimal data and computational overhead, striking an effective balance between generalization and domain adaptability.

๐Ÿ“ Abstract
Time Series Foundation Models (TSFMs) have recently achieved state-of-the-art performance in univariate forecasting on new time series simply by conditioning on a brief history of past values. Their success demonstrates that large-scale pretraining across diverse domains can acquire the inductive bias to generalize from temporal patterns in a brief history. However, most TSFMs are unable to leverage covariates -- future-available exogenous variables critical for accurate forecasting in many applications -- due to their domain-specific nature and the lack of associated inductive bias. We propose TFMAdapter, a lightweight, instance-level adapter that augments TSFMs with covariate information without fine-tuning. Instead of retraining, TFMAdapter operates on the limited history provided during a single model call, learning a non-parametric cascade that combines covariates with univariate TSFM forecasts. However, such learning would require univariate forecasts at every step in the history, entailing too many calls to the TSFM. To enable training on the full historical context while limiting TSFM invocations, TFMAdapter uses a two-stage method: (1) generating pseudo-forecasts with a simple regression model, and (2) training a Gaussian Process regressor to refine predictions using both pseudo- and TSFM forecasts alongside covariates. Extensive experiments on real-world datasets demonstrate that TFMAdapter consistently outperforms both foundation models and supervised baselines, achieving a 24-27% improvement over base foundation models with minimal data and computational overhead. Our results highlight the potential of lightweight adapters to bridge the gap between generic foundation models and domain-specific forecasting needs.
Problem

Research questions and friction points this paper is trying to address.

Enabling time series foundation models to incorporate future-available exogenous covariates
Overcoming computational limitations of requiring multiple TSFM calls for historical context
Bridging the gap between generic foundation models and domain-specific forecasting needs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Lightweight adapter for covariate integration
Two-stage method with pseudo-forecasts
Non-parametric cascade combining covariates with forecasts
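The two-stage cascade above can be sketched with scikit-learn. This is a hedged illustration, not the paper's implementation: the toy series, the choice of a lag-based linear regression for stage 1, and the RBF-plus-noise GP kernel are all assumptions; only the overall structure (cheap pseudo-forecasts over the history, then a GP that fuses forecasts with covariates) follows the abstract.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Toy instance: a seasonal series plus a strong exogenous covariate effect.
T, H, L = 120, 24, 12                  # history length, horizon, AR lag window
t = np.arange(T + H)
cov = rng.normal(size=T + H)           # covariate, known in advance
y = np.sin(t / 6) + 2.0 * cov + 0.05 * rng.normal(size=T + H)

# Stand-in for the base TSFM's univariate forecast over the horizon
# (a real adapter would obtain this from a single foundation-model call).
tsfm_forecast = np.sin(t[T:] / 6)

# Stage 1: a simple lagged regression produces pseudo-forecasts at every
# historical step, standing in for costly per-step TSFM calls (assumed form).
lags = np.column_stack([y[L - k - 1:T - k - 1] for k in range(L)])
ar = LinearRegression().fit(lags, y[L:T])
pseudo = ar.predict(lags)              # pseudo-forecasts for steps L..T-1

# Stage 2: a Gaussian Process learns to fuse the (pseudo-)forecast with the
# covariate on the history; at inference the TSFM forecast fills that slot.
X_hist = np.column_stack([pseudo, cov[L:T]])
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_hist, y[L:T])
refined = gp.predict(np.column_stack([tsfm_forecast, cov[T:]]))

mae_tsfm = np.mean(np.abs(tsfm_forecast - y[T:]))
mae_refined = np.mean(np.abs(refined - y[T:]))
```

In this toy the covariate effect is invisible to the univariate forecast, so the fused prediction should recover most of it; the instance-level flavor comes from fitting both stages only on the single series' own history.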
๐Ÿ”Ž Similar Papers
No similar papers found.
A
Afrin Dange
Indian Institute of Technology Bombay, Centre for Machine Intelligence and Data Science
Sunita Sarawagi
Indian Institute of Technology Bombay, Mumbai, India.
Adaptation · Causal Inference · In-context learning · Semantic parsing