LightGTS: A Lightweight General Time Series Forecasting Model

📅 2025-06-06
📈 Citations: 1
Influential: 0
🤖 AI Summary
To address the high computational cost and the trade-off between generalization and efficiency in generic time-series forecasting models under resource-constrained settings, this paper proposes a lightweight periodicity-aware forecasting framework. Methodologically, it deeply integrates inherent time-series periodicity as an inductive bias: (i) a novel periodic tokenization mechanism explicitly captures multi-scale periodic structures; (ii) a periodic parallel decoding strategy enhances temporal modeling efficiency; and (iii) a lightweight Transformer architecture is combined with multi-source pretraining. Evaluated on nine real-world benchmarks, the method achieves state-of-the-art performance under both zero-shot and full-finetuning settings. Moreover, it accelerates inference by 3.2–8.7× over leading time-series foundation models. This work significantly advances the development of lightweight, highly generalizable time-series models.
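The periodic tokenization idea above can be illustrated with a toy sketch. This is a hypothetical reconstruction, not the paper's implementation: the dominant period is estimated from the FFT amplitude spectrum (one common choice; the summary does not disclose the actual detection procedure), and the series is then patched into non-overlapping one-cycle tokens, so that datasets sampled at different scales yield tokens that each cover one full periodic structure. The helper names `estimate_period` and `periodic_tokenize` are illustrative.

```python
import numpy as np

def estimate_period(x: np.ndarray, max_period: int = 64) -> int:
    """Estimate the dominant period of a 1-D series via the FFT
    amplitude spectrum (a common heuristic; hypothetical here)."""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    spectrum[0] = 0.0                          # drop the DC component
    freq = int(np.argmax(spectrum[: len(x) // 2]))
    if freq == 0:
        return 1
    return min(len(x) // freq, max_period)

def periodic_tokenize(x: np.ndarray, max_period: int = 64) -> np.ndarray:
    """Split a series into non-overlapping patches of one period each,
    so every token spans one full cycle regardless of sampling scale."""
    p = estimate_period(x, max_period)
    n_tokens = len(x) // p
    return x[: n_tokens * p].reshape(n_tokens, p)

# Example: 240 hourly steps of a daily (period-24) cycle.
series = np.sin(2 * np.pi * np.arange(240) / 24)
tokens = periodic_tokenize(series)            # 10 tokens of length 24
```

In the actual model, variable-length period tokens would still need to be projected to a fixed embedding dimension before entering the Transformer; that step is omitted here.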

📝 Abstract
Existing works on general time series forecasting build foundation models with heavy model parameters through large-scale multi-source pre-training. These models achieve superior generalization ability across various datasets at the cost of significant computational burdens and limitations in resource-constrained scenarios. This paper introduces LightGTS, a lightweight general time series forecasting model designed from the perspective of consistent periodical modeling. To handle diverse scales and intrinsic periods in multi-source pre-training, we introduce Periodical Tokenization, which extracts consistent periodic patterns across different datasets with varying scales. To better utilize the periodicity in the decoding process, we further introduce Periodical Parallel Decoding, which leverages historical tokens to improve forecasting. Based on the two techniques above which fully leverage the inductive bias of periods inherent in time series, LightGTS uses a lightweight model to achieve outstanding performance on general time series forecasting. It achieves state-of-the-art forecasting performance on 9 real-world benchmarks in both zero-shot and full-shot settings with much better efficiency compared with existing time series foundation models.
Problem

Research questions and friction points this paper is trying to address.

Lightweight model for general time series forecasting
Handling diverse scales and intrinsic periods efficiently
Improving forecasting with consistent periodic modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Lightweight model for time series forecasting
Periodical Tokenization for consistent patterns
Periodical Parallel Decoding improves forecasting
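The parallel-decoding contribution above can be sketched in miniature. This is a hedged illustration, assuming the mechanism works roughly as the abstract describes (the exact decoder is not specified on this page): rather than generating forecast tokens one at a time autoregressively, every horizon position is initialized from the last observed cycle, giving a periodic prior that a decoder could then refine in a single parallel pass. `parallel_decode_init` is a hypothetical name.

```python
import numpy as np

def parallel_decode_init(tokens: np.ndarray, horizon_tokens: int) -> np.ndarray:
    """Initialize all forecast tokens at once from the most recent
    observed cycle, enabling one parallel decoding pass instead of
    token-by-token autoregression (hypothetical sketch)."""
    return np.tile(tokens[-1], (horizon_tokens, 1))

# History as period-length tokens: 10 cycles of a 24-step sine wave.
history = np.sin(2 * np.pi * np.arange(240) / 24).reshape(10, 24)
init = parallel_decode_init(history, 4)   # 4 future cycles, all at once
```

Because all horizon tokens are available up front, decoding cost no longer grows with sequential generation steps, which is consistent with the reported inference speedups.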
👥 Authors

Yihang Wang
Case Western Reserve University
biophysics · statistical mechanics · machine learning

Yuying Qiu
East China Normal University, Shanghai, China

Peng Chen
East China Normal University, Shanghai, China

Yang Shu
East China Normal University, Shanghai, China

Zhongwen Rao
Noah's Ark Lab, Huawei
Time Series · Spatial-Temporal

Lujia Pan
Noah's Ark Lab, Huawei
Anomaly detection · Time series · Representation learning

Bin Yang
East China Normal University, Shanghai, China

Chenjuan Guo
Professor, East China Normal University
Data Analytics · Machine Learning