Kairos: Towards Adaptive and Generalizable Time Series Foundation Models

📅 2025-09-30
🤖 AI Summary
Current time-series foundation models (TSFMs) face two key bottlenecks in zero-shot forecasting: (1) fixed-granularity tokenization fails to adapt to the dynamically varying information density within a sequence, and (2) generic positional encodings inadequately capture diverse periodicities and trends. To address these, the authors propose Kairos, an adaptive TSFM framework featuring (1) a dynamic patching tokenizer that adjusts patch size according to local information density, (2) an instance-adaptive positional embedding that learns sequence-specific temporal structure, and (3) a multi-patch prediction strategy applied at inference. Pretrained on the large-scale PreSTS corpus, Kairos achieves state-of-the-art zero-shot performance on the GIFT-Eval and Time-Series-Library benchmarks, outperforming prior methods with significantly fewer parameters and enabling instance-level, scale-adaptive modeling of heterogeneous time series.
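The dynamic-patching idea can be illustrated with a toy heuristic: carve the series into small patches where local variability (a crude proxy for information density) is high, and large patches where it is low. This is a minimal sketch only; Kairos's tokenizer is learned end-to-end, and the variance threshold used here is an assumption for illustration.

```python
import numpy as np

def dynamic_patches(series, min_size=8, max_size=32, threshold=0.5):
    """Split a 1-D series into variable-size patches: fine granularity
    where local variance is high, coarse where the signal is smooth.
    Illustrative heuristic only -- not the paper's learned tokenizer."""
    # Normalize so the variance threshold is scale-free.
    x = (series - series.mean()) / (series.std() + 1e-8)
    patches, i = [], 0
    while i < len(x):
        window = x[i:i + max_size]
        # High local variance -> denser information -> smaller patch.
        size = min_size if window.std() > threshold else max_size
        patches.append(x[i:i + size])
        i += size
    return patches

smooth = np.linspace(0, 1, 64)                     # low information density
noisy = np.random.default_rng(0).normal(size=64)   # high information density
series = np.concatenate([smooth, noisy])
sizes = [len(p) for p in dynamic_patches(series)]  # coarse patches first, fine patches later
```

On this toy input the smooth half is covered by a few coarse patches while the noisy half is split into many fine ones, so the token budget concentrates where the signal is hardest to summarize.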

📝 Abstract
Time series foundation models (TSFMs) have emerged as a powerful paradigm for time series analysis, driven by large-scale pretraining on diverse data corpora. However, time series inherently exhibit heterogeneous information density over time, influenced by system states and signal complexity, presenting significant modeling challenges especially in a zero-shot scenario. Current TSFMs rely on non-adaptive processing pipelines that fail to capture this dynamic nature. For example, common tokenization strategies such as fixed-size patching enforce rigid observational granularity, limiting their ability to adapt to varying information densities. Similarly, conventional positional encodings impose a uniform temporal scale, making it difficult to model diverse periodicities and trends across series. To overcome these limitations, we propose Kairos, a flexible TSFM framework that integrates a dynamic patching tokenizer and an instance-adaptive positional embedding. Kairos adaptively selects tokenization granularity and tailors positional encodings to the unique characteristics of each time series instance. Trained on a large-scale Predictability-Stratified Time Series (PreSTS) corpus comprising over 300 billion time points and adopting a multi-patch prediction strategy in the inference stage, Kairos achieves superior performance with far fewer parameters on two common zero-shot benchmarks, GIFT-Eval and the Time-Series-Library benchmark, consistently outperforming established methods across diverse tasks. The project page is at https://foundation-model-research.github.io/Kairos.
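One way to picture an instance-adaptive positional encoding is to anchor the sinusoidal frequency schedule to each instance's own dominant period rather than a fixed base. The sketch below estimates that period with an FFT; this is an assumed hand-crafted stand-in, whereas Kairos learns the adaptation during pretraining.

```python
import numpy as np

def instance_adaptive_pe(series, d_model=16):
    """Sinusoidal positional encoding whose fastest channel completes one
    cycle per dominant period of the input instance (FFT-estimated).
    Simplified sketch -- the paper's embedding is learned, not FFT-based."""
    n = len(series)
    spectrum = np.abs(np.fft.rfft(series - series.mean()))
    k = int(spectrum[1:].argmax()) + 1          # strongest non-DC component
    period = n / k
    pos = np.arange(n)[:, None]
    i = np.arange(d_model // 2)[None, :]
    # Anchor the lowest-index channel to the instance's period; deeper
    # channels decay geometrically as in vanilla sinusoidal encodings.
    freqs = (2 * np.pi / period) * (10000.0 ** (-i / (d_model // 2)))
    angles = pos * freqs
    pe = np.empty((n, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

series = np.sin(2 * np.pi * np.arange(128) / 16)  # dominant period 16
pe = instance_adaptive_pe(series)
```

Two series with different seasonalities thus receive different encodings at the same positions, which is the property a uniform temporal scale cannot provide.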
Problem

Research questions and friction points this paper is trying to address.

Addresses heterogeneous information density in time series data
Overcomes rigid tokenization strategies in foundation models
Solves uniform temporal scale limitations in positional encodings
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic patching tokenizer adapts tokenization granularity
Instance-adaptive positional embedding customizes temporal representations
Multi-patch prediction strategy enhances inference performance
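The multi-patch prediction idea can be sketched as follows: roll the model forward at several prediction-patch granularities and combine the resulting forecasts. The `forecast_fn` callable and the simple averaging rule are assumptions for illustration, not Kairos's actual decoding or aggregation scheme.

```python
import numpy as np

def multi_patch_forecast(forecast_fn, context, horizon, patch_sizes=(8, 16, 32)):
    """Ensemble forecasts produced at several prediction-patch sizes.
    `forecast_fn(ctx, steps)` is a stand-in for a pretrained TSFM's
    decode step; mean aggregation is an illustrative assumption."""
    preds = []
    for p in patch_sizes:
        ctx, out = context.copy(), []
        # Roll forward one patch at a time until the horizon is covered.
        while len(out) < horizon:
            step = forecast_fn(ctx, p)
            out.extend(step)
            ctx = np.concatenate([ctx, step])
        preds.append(np.array(out[:horizon]))
    return np.mean(preds, axis=0)

# Toy stand-in "model": persistence of the last observed value.
naive = lambda ctx, steps: [ctx[-1]] * steps
yhat = multi_patch_forecast(naive, np.arange(10.0), horizon=12)
```

Coarse patches cover the horizon in few steps while fine patches refine short-range detail; ensembling the scales hedges against committing to a single granularity.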
Authors
Kun Feng (Illinois Institute of Technology)
Shaocheng Lan (School of Information Science and Technology, ShanghaiTech University, Shanghai, China)
Yuchen Fang (Ant Group, Shanghai, China)
Wenchao He (School of Information Science and Technology, ShanghaiTech University, Shanghai, China)
Lintao Ma (Ant Group; Bayesian learning, time series analysis, generative models)
Xingyu Lu (Ant Group, Shanghai, China)
Kan Ren (Assistant Professor, ShanghaiTech University; Machine Learning, Data Mining, Large Language Model, Foundation Model)