General Time-series Model for Universal Knowledge Representation of Multivariate Time-Series data

📅 2025-02-05
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the long-overlooked problem of heterogeneous joint frequency-domain distributions across multiple temporal granularities in multivariate time series. We propose the first general-purpose foundation model for multi-granularity time-series knowledge representation. Our key contributions are: (1) the first theoretical and empirical demonstration that multivariate time series exhibit intrinsically heterogeneous joint distributions in the frequency domain across granularities; (2) the Fourier Knowledge Attention mechanism, which enables frequency-aware knowledge modeling by explicitly encoding spectral dependencies; and (3) an autoregressive masked reconstruction pre-training paradigm that supports task-agnostic, generative unified representation learning. Integrating Fourier transforms, multi-scale representation learning, and knowledge-aware attention, our model achieves state-of-the-art performance across three canonical generative downstream tasks—long-horizon forecasting, anomaly detection, and missing-value imputation—outperforming all existing methods.

📝 Abstract
Universal knowledge representation is a central problem for multivariate time series (MTS) foundation models, and it remains open. This paper investigates the problem from first principles and makes four contributions. First, a new empirical finding is revealed: time series with different time granularities (or corresponding frequency resolutions) exhibit distinct joint distributions in the frequency domain. This implies a crucial aspect of learning universal knowledge, one that has been overlooked by previous studies. Second, a novel Fourier knowledge attention mechanism is proposed to enable learning time-granularity-aware representations from both the temporal and frequency domains. Third, an autoregressive blank-infilling pre-training framework is incorporated into time series analysis for the first time, leading to a generative, task-agnostic pre-training strategy. To this end, we develop the General Time-series Model (GTM), a unified MTS foundation model that addresses a limitation of contemporary time series models, which often require token-, pre-training-, or model-level customization for downstream task adaptation. Fourth, extensive experiments show that GTM outperforms state-of-the-art (SOTA) methods across all generative tasks, including long-term forecasting, anomaly detection, and imputation.
Problem

Research questions and friction points this paper is trying to address.

Universal knowledge representation for MTS
Time granularity-aware frequency domain learning
Generative task-agnostic pre-training strategy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fourier knowledge attention mechanism
Autoregressive blank infilling pre-training
General Time-series Model (GTM)
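The paper does not reproduce the Fourier knowledge attention mechanism here, but the general idea it describes (attending over frequency-domain features of series patches) can be sketched minimally. Everything below is hypothetical: the weight matrices, shapes, and the real/imaginary feature concatenation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fourier_knowledge_attention(x, Wq, Wk, Wv):
    """Hypothetical sketch: scaled dot-product attention computed over
    frequency-domain features of real-valued series patches.
    x: (num_patches, patch_len) array of time-series patches."""
    # Move each patch to the frequency domain (real-input FFT).
    spec = np.fft.rfft(x, axis=-1)                           # (P, L//2 + 1), complex
    # Represent each complex spectrum as concatenated real/imag parts.
    feats = np.concatenate([spec.real, spec.imag], axis=-1)  # (P, 2 * (L//2 + 1))
    q, k, v = feats @ Wq, feats @ Wk, feats @ Wv
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]))           # (P, P) patch-to-patch weights
    return attn @ v                                          # (P, d) spectral-aware features

rng = np.random.default_rng(0)
P, L, d = 4, 16, 8                  # patches, patch length, feature dim (all illustrative)
F2 = 2 * (L // 2 + 1)               # real + imaginary feature size
x = rng.standard_normal((P, L))
Wq, Wk, Wv = (rng.standard_normal((F2, d)) * 0.1 for _ in range(3))
out = fourier_knowledge_attention(x, Wq, Wk, Wv)
print(out.shape)  # prints (4, 8)
```

The sketch only illustrates the "frequency-aware attention" idea from the abstract; the actual GTM model additionally handles multiple granularities and temporal-domain features.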
Cheng He
University of Science and Technology of China, Di-Matrix Information Technology Co., Ltd, China
Xu Huang
University of Science and Technology of China
Gangwei Jiang
University of Science and Technology of China
machine learning
Zhaoyi Li
University of Science and Technology of China
Defu Lian
University of Science and Technology of China
Hong Xie
University of Science and Technology of China (USTC)
Data Science/Mining, Online Learning
Enhong Chen
University of Science and Technology of China
data mining, recommender systems, machine learning
Xijie Liang
Di-Matrix Information Technology Co., Ltd, China
Zengrong Zheng
Di-Matrix Information Technology Co., Ltd, China