One-Embedding-Fits-All: Efficient Zero-Shot Time Series Forecasting by a Model Zoo

📅 2025-09-04
🤖 AI Summary
Single time-series foundation models (TSFMs) struggle to achieve universal optimality across diverse forecasting tasks. Method: This paper proposes ZooCast, a framework that assembles a "zoo" of TSFMs and introduces the One-Embedding-Fits-All paradigm, mapping heterogeneous models into a unified embedding space. Model-embedding learning and similarity-based matching enable zero-shot dynamic model selection and lightweight ensemble inference. Contribution/Results: ZooCast supports efficient retrieval, seamless incremental integration of new models, and cross-task generalization. On the GIFT-Eval benchmark, it outperforms individual TSFMs and state-of-the-art ensemble methods, achieving consistent accuracy gains while preserving single-model inference latency and enabling scalable deployment.

📝 Abstract
The proliferation of Time Series Foundation Models (TSFMs) has significantly advanced zero-shot forecasting, enabling predictions for unseen time series without task-specific fine-tuning. Extensive research has confirmed that no single TSFM excels universally, as different models exhibit preferences for distinct temporal patterns. This diversity suggests an opportunity: how to take advantage of the complementary abilities of TSFMs. To this end, we propose ZooCast, which characterizes each model's distinct forecasting strengths. ZooCast can intelligently assemble current TSFMs into a model zoo that dynamically selects optimal models for different forecasting tasks. Our key innovation lies in the One-Embedding-Fits-All paradigm that constructs a unified representation space where each model in the zoo is represented by a single embedding, enabling efficient similarity matching for all tasks. Experiments demonstrate ZooCast's strong performance on the GIFT-Eval zero-shot forecasting benchmark while maintaining the efficiency of a single TSFM. In real-world scenarios with sequential model releases, the framework seamlessly adds new models for progressive accuracy gains with negligible overhead.
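The One-Embedding-Fits-All idea in the abstract, where each zoo model is represented by a single embedding and matched to a task by similarity, can be sketched as follows. This is a minimal illustration with hypothetical 4-d embeddings and model names, not the paper's actual implementation:

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def select_model(task_emb, zoo):
    """Pick the zoo model whose embedding best matches the task embedding."""
    return max(zoo, key=lambda name: cosine_sim(task_emb, zoo[name]))

# Hypothetical embeddings for three TSFMs in the zoo (illustrative values).
zoo = {
    "tsfm_a": np.array([0.9, 0.1, 0.0, 0.2]),
    "tsfm_b": np.array([0.1, 0.8, 0.3, 0.0]),
    "tsfm_c": np.array([0.0, 0.2, 0.9, 0.4]),
}

task_emb = np.array([0.85, 0.15, 0.05, 0.1])  # embedding of an unseen task
best = select_model(task_emb, zoo)  # closest model in the embedding space
```

Because selection is a single similarity lookup, retrieval cost stays negligible relative to running one forecast, which is consistent with the abstract's claim of single-TSFM efficiency.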
Problem

Research questions and friction points this paper is trying to address.

Leveraging diverse Time Series Foundation Models' complementary strengths
Dynamically selecting optimal models for zero-shot forecasting tasks
Creating unified embedding space for efficient model-task matching
Innovation

Methods, ideas, or system contributions that make the work stand out.

Model zoo with dynamic model selection
One-Embedding-Fits-All unified representation space
Efficient similarity matching across forecasting tasks
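The lightweight ensemble inference mentioned in the summary could, for instance, blend the forecasts of the few most similar models. A minimal sketch, assuming softmax weighting over similarity scores and hypothetical per-model forecasts (neither is confirmed by the source):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def ensemble_forecast(similarities, forecasts, top_k=2):
    """Blend the forecasts of the top-k most similar models, weighted by similarity."""
    ranked = sorted(similarities, key=similarities.get, reverse=True)[:top_k]
    weights = softmax([similarities[m] for m in ranked])
    horizon = len(next(iter(forecasts.values())))
    return [
        sum(w * forecasts[m][t] for w, m in zip(weights, ranked))
        for t in range(horizon)
    ]

# Hypothetical similarity scores and 3-step forecasts from three models.
sims = {"tsfm_a": 0.95, "tsfm_b": 0.60, "tsfm_c": 0.10}
preds = {
    "tsfm_a": [10.0, 11.0, 12.0],
    "tsfm_b": [12.0, 12.0, 12.0],
    "tsfm_c": [0.0, 0.0, 0.0],
}
blend = ensemble_forecast(sims, preds, top_k=2)
```

Adding a newly released model to the zoo only requires computing its one embedding and inserting it into the dictionaries above, which matches the paper's claim of seamless incremental integration with negligible overhead.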
Hao-Nan Shi
School of Artificial Intelligence, Nanjing University

Ting-Ji Huang
Nanjing University

Lu Han
School of Artificial Intelligence, Nanjing University

De-Chuan Zhan
Nanjing University, China
Machine Learning, Data Mining

Han-Jia Ye
Nanjing University
Machine Learning, Data Mining, Metric Learning, Meta-Learning