Are Time Series Foundation Models Susceptible to Catastrophic Forgetting?

📅 2025-10-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work presents the first systematic investigation of catastrophic forgetting in time-series foundation models (TSFMs) under continual learning, where sequential fine-tuning across multiple tasks significantly degrades performance on previously learned tasks, revealing a critical robustness deficiency. Method: the authors propose a knowledge-retention quantification framework built on controllable, synthetically generated periodic datasets, combining zero-shot transfer with sequential fine-tuning to decouple and independently assess the stability-plasticity trade-off between new-task adaptation and old-knowledge retention. Contribution/Results: empirical evaluation shows that while existing TSFMs improve on new tasks, they suffer severe forgetting across diverse benchmarks, exposing fundamental limitations in their continual-learning capability. The work establishes a reproducible evaluation benchmark and provides diagnostic tools to guide the robust evolution of TSFMs.
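The summary's evaluation setup rests on controllable periodic synthetic data, where each "task" is a distinct periodic configuration. A minimal sketch of how such series could be generated; the function name and parameters are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def make_periodic_series(n_steps, periods, amplitudes, noise_std=0.1, seed=0):
    """Generate a synthetic series as a sum of sinusoids plus Gaussian noise.

    `periods` and `amplitudes` control the degree of periodic structure;
    these knobs are illustrative stand-ins for the paper's dataset parameters.
    """
    rng = np.random.default_rng(seed)
    t = np.arange(n_steps)
    series = sum(a * np.sin(2 * np.pi * t / p)
                 for p, a in zip(periods, amplitudes))
    return series + rng.normal(0.0, noise_std, size=n_steps)

# One task = one periodic configuration; a second task shifts the structure,
# e.g. adding a longer (weekly-like) cycle on top of a daily-like one.
task_a = make_periodic_series(1000, periods=[24], amplitudes=[1.0])
task_b = make_periodic_series(1000, periods=[24, 168], amplitudes=[0.5, 1.0])
```

Fine-tuning a TSFM on `task_a` and then on `task_b`, while evaluating on both after each step, is the kind of sequential protocol the summary describes.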

📝 Abstract
Time Series Foundation Models (TSFMs) have shown promising zero-shot generalization across diverse forecasting tasks. However, their robustness to continual adaptation remains underexplored. In this work, we investigate the extent to which TSFMs suffer from catastrophic forgetting when fine-tuned sequentially on multiple datasets. Using synthetic datasets designed with varying degrees of periodic structure, we measure the trade-off between adaptation to new data and retention of prior knowledge. Our experiments reveal that, while fine-tuning improves performance on new tasks, it often causes significant degradation on previously learned ones, illustrating a fundamental stability-plasticity dilemma.
Problem

Research questions and friction points this paper is trying to address.

Investigating catastrophic forgetting in time series foundation models
Measuring adaptation-retention trade-off during sequential fine-tuning
Analyzing stability-plasticity dilemma in continual learning scenarios
Innovation

Methods, ideas, or system contributions that make the work stand out.

First systematic evaluation of catastrophic forgetting in time series foundation models
Controllable synthetic periodic datasets for quantifying the adaptation-retention trade-off
Reproducible benchmark contrasting zero-shot transfer with sequential fine-tuning
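The adaptation-retention trade-off listed above is commonly quantified by tracking each task's error as later tasks are learned. A hedged sketch using generic continual-learning definitions (average forgetting and post-learning error), which are not necessarily the paper's exact metrics:

```python
import numpy as np

def forgetting_and_adaptation(err):
    """err[i][j]: error on task j after fine-tuning on tasks 0..i.

    Forgetting of task j = error after the final task minus the error
    right after task j was learned (positive = knowledge lost).
    Adaptation is summarized as the mean error on each task immediately
    after learning it.
    """
    err = np.asarray(err, dtype=float)
    n = err.shape[0]
    learned = np.array([err[j, j] for j in range(n)])
    final = err[n - 1]
    # The last task has no later fine-tuning, so it cannot be forgotten yet.
    forgetting = final[:-1] - learned[:-1]
    return forgetting.mean(), learned.mean()

# Toy numbers: task 0's error rises from 0.10 to 0.30 after task 1 is learned.
errs = [[0.10, 0.80],
        [0.30, 0.12]]
avg_forgetting, avg_adapt = forgetting_and_adaptation(errs)
# avg_forgetting = 0.20 (degradation on task 0), avg_adapt = 0.11
```

A severe stability-plasticity dilemma shows up as low post-learning error (good adaptation) paired with large positive forgetting.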
Nouha Karaouli
Univ. Rennes, CNRS, Inria, IRISA - UMR 6074, F-35000 Rennes, France
Denis Coquenet
Associate Professor, Rennes University
Deep Learning, Computer Vision
Elisa Fromont
Professor, Université de Rennes, France
Data Mining, Machine Learning, Computer Vision, Time Series Analysis
Martial Mermillod
Univ. Grenoble Alpes, Univ. Savoie Mont Blanc, CNRS, LPNC, Grenoble, France
Marina Reyboz
Univ. Grenoble Alpes, CEA, LIST, 38000 Grenoble, France