MantisV2: Closing the Zero-Shot Gap in Time Series Classification with Synthetic Data and Test-Time Strategies

📅 2026-02-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the significant performance gap between frozen encoders and fine-tuned models in time series classification. To bridge this gap, the authors propose Mantis+, the first time series foundation model pretrained exclusively on synthetic data, and refine the architecture through controlled ablations into MantisV2, a lighter and more efficient encoder. They further introduce a test-time strategy that leverages intermediate-layer representations, refined output-token aggregation, self-ensembling, and cross-model embedding fusion to substantially enhance zero-shot feature generalization. Evaluated on standard benchmarks, including UCR, UEA, HAR, and EEG, the proposed approach achieves state-of-the-art zero-shot time series classification performance.

📝 Abstract
Developing foundation models for time series classification is of high practical relevance, as such models can serve as universal feature extractors for diverse downstream tasks. Although early models such as Mantis have shown the promise of this approach, a substantial performance gap remained between frozen and fine-tuned encoders. In this work, we introduce methods that significantly strengthen zero-shot feature extraction for time series. First, we introduce Mantis+, a variant of Mantis pre-trained entirely on synthetic time series. Second, through controlled ablation studies, we refine the architecture and obtain MantisV2, an improved and more lightweight encoder. Third, we propose an enhanced test-time methodology that leverages intermediate-layer representations and refines output-token aggregation. In addition, we show that performance can be further improved via self-ensembling and cross-model embedding fusion. Extensive experiments on UCR, UEA, Human Activity Recognition (HAR) benchmarks, and EEG datasets show that MantisV2 and Mantis+ consistently outperform prior time series foundation models, achieving state-of-the-art zero-shot performance.
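The abstract's test-time recipe, frozen encoders whose intermediate and output representations are fused across models before a cheap probe is fit, can be illustrated with a minimal sketch. The encoders below are random-projection stand-ins, not the actual Mantis/MantisV2 models, and the nearest-centroid probe is an illustrative choice; only the general fusion idea comes from the paper.

```python
# Hedged sketch of test-time embedding fusion for zero-shot classification.
# toy_encoder is a hypothetical stand-in for a frozen foundation model; the
# real Mantis/MantisV2 architectures and aggregation details are not shown.
import numpy as np

rng = np.random.default_rng(0)

def toy_encoder(x, dims=(64, 32)):
    """Stand-in frozen encoder: returns an intermediate-layer and a
    final-layer representation for each input series."""
    h = np.tanh(x @ rng.standard_normal((x.shape[1], dims[0])))
    z = np.tanh(h @ rng.standard_normal((dims[0], dims[1])))
    return h, z

def fuse(x, encoders):
    """Concatenate intermediate and output features across several frozen
    models (cross-model embedding fusion + intermediate-layer reuse)."""
    feats = []
    for enc in encoders:
        h, z = enc(x)
        feats.extend([h, z])
    return np.concatenate(feats, axis=1)

# Dummy fixed-length time series, two classes of 10 samples each.
X = rng.standard_normal((20, 128))
y = np.array([0] * 10 + [1] * 10)

# Zero-shot setting: encoders stay frozen; only a cheap probe sees labels.
Z = fuse(X, [toy_encoder, toy_encoder])
centroids = np.stack([Z[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
```

Each encoder contributes both its intermediate (64-d) and output (32-d) features, so two encoders yield a 192-d fused representation; swapping the nearest-centroid probe for a linear classifier or adding self-ensembled augmented views would follow the same pattern.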
Problem

Research questions and friction points this paper is trying to address.

time series classification
zero-shot learning
foundation models
performance gap
feature extraction
Innovation

Methods, ideas, or system contributions that make the work stand out.

zero-shot time series classification
synthetic data pretraining
test-time adaptation
foundation model
embedding fusion