Foundation Models and Fine-Tuning: Toward a New Generation of Models for Time Series Forecasting

📅 2025-11-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address weak zero-shot generalization in long-horizon time series forecasting, this paper proposes a foundation model framework for time series prediction. Inspired by large language model paradigms, it applies self-supervised pretraining on large-scale heterogeneous time series data to learn general-purpose temporal representations. It unifies point and probabilistic forecasting under a single modeling objective and introduces a lightweight fine-tuning strategy for downstream adaptation. This approach moves beyond conventional task-specific architectures and substantially improves zero-shot forecasting on unseen datasets. Experiments show that the fine-tuned model achieves an average 18.7% reduction in MAE on long-horizon forecasting tasks, with strong cross-domain adaptability across multi-source, multi-frequency, and multi-domain time series. The work establishes a new paradigm for time series foundation model research.


📝 Abstract
Inspired by recent advances in large language models, foundation models have been developed for zero-shot time series forecasting, enabling prediction on datasets unseen during pretraining. These large-scale models, trained on vast collections of time series, learn generalizable representations for both point and probabilistic forecasting, reducing the need for task-specific architectures and manual tuning. In this work, we review the main architectures, pretraining strategies, and optimization methods used in such models, and study the effect of fine-tuning after pretraining to enhance their performance on specific datasets. Our empirical results show that fine-tuning generally improves zero-shot forecasting capabilities, especially for long-term horizons.
Problem

Research questions and friction points this paper is trying to address.

Developing foundation models for zero-shot time series forecasting
Studying fine-tuning effects to improve forecasting on specific datasets
Reducing need for task-specific architectures and manual tuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Foundation models enable zero-shot time series forecasting
Models learn generalizable representations from vast datasets
Fine-tuning enhances zero-shot forecasting for long-term horizons
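To make the fine-tuning idea above concrete, here is a minimal, hypothetical sketch (not the paper's actual method or model): a frozen "pretrained" backbone is stood in by a fixed random feature map, and only a small linear forecasting head is fit on the target dataset, mimicking lightweight head-only adaptation after pretraining. All names and the ridge-regression head are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of lightweight fine-tuning (assumed setup, not
# the paper's architecture). The frozen backbone is a fixed random
# projection; only the linear head is adapted on the target series.
rng = np.random.default_rng(0)

def make_series(n=400):
    # Toy target dataset: daily-seasonal signal plus noise.
    t = np.arange(n)
    return np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(n)

def windows(series, context=48, horizon=12):
    # Slice the series into (context window, forecast horizon) pairs.
    X, y = [], []
    for i in range(len(series) - context - horizon):
        X.append(series[i:i + context])
        y.append(series[i + context:i + context + horizon])
    return np.array(X), np.array(y)

series = make_series()
X, y = windows(series)

# "Pretrained" backbone: a frozen nonlinear feature map.
W_backbone = rng.standard_normal((X.shape[1], 64)) / np.sqrt(X.shape[1])
H = np.tanh(X @ W_backbone)  # representations; backbone stays frozen

# Zero-shot head: untrained, near-zero weights.
W_zero = rng.standard_normal((64, y.shape[1])) * 0.01
mae_zero = np.abs(H @ W_zero - y).mean()

# Lightweight fine-tuning: fit only the head by ridge regression.
lam = 1e-2
W_ft = np.linalg.solve(H.T @ H + lam * np.eye(64), H.T @ y)
mae_ft = np.abs(H @ W_ft - y).mean()

print(f"zero-shot MAE: {mae_zero:.3f}, fine-tuned MAE: {mae_ft:.3f}")
```

The design point mirrors the paper's finding: adapting a small number of parameters on top of frozen general-purpose representations can markedly reduce forecasting error relative to using the pretrained model as-is.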
Morad Laglil
IMAG 700 Av. Centrale 38058 Saint Martin d’Hères, France
Emilie Devijver
CNRS
statistical learning, causality
Eric Gaussier
Professor Univ. Grenoble Alpes
NLP/computational linguistics, information retrieval, causality, machine learning
Bertrand Pracca
SA VOYE SASU - Saint-Etienne