🤖 AI Summary
Existing patch-based time-series forecasting methods enhance periodic pattern modeling but suffer from inefficiency due to excessive parameter counts and high computational overhead. This inefficiency stems fundamentally from local patch-wise processing, which disrupts global phase relationships across the sequence.
Method: We propose a novel “phase modeling paradigm” that replaces conventional patch tokens with compact phase embeddings, integrated with a lightweight routing mechanism and a staged prediction architecture to enable efficient cross-phase interaction.
Contribution/Results: Our approach achieves global periodic pattern modeling with only ~1K parameters—orders of magnitude fewer than prior patch-based models—while attaining state-of-the-art (SOTA) performance across multiple benchmark datasets. Notably, it significantly outperforms comparably parameterized models on large-scale and complex sequences, effectively breaking the long-standing efficiency–accuracy trade-off in time-series forecasting.
📝 Abstract
Periodicity is a fundamental characteristic of time series data and has long played a central role in forecasting. Recent deep learning methods strengthen the exploitation of periodicity by treating patches as basic tokens, thereby improving predictive effectiveness. However, their efficiency remains a bottleneck due to large parameter counts and heavy computational costs. This paper provides, for the first time, a clear explanation of why patch-level processing is inherently inefficient, supported by strong evidence from real-world data. To address these limitations, we introduce a phase perspective for modeling periodicity and present an efficient yet effective solution, PhaseFormer. PhaseFormer features phase-wise prediction through compact phase embeddings and efficient cross-phase interaction enabled by a lightweight routing mechanism. Extensive experiments demonstrate that PhaseFormer achieves state-of-the-art performance with around 1K parameters, consistently across benchmark datasets. Notably, it excels on large-scale and complex datasets, where models with comparable efficiency often struggle. This work marks a significant step toward truly efficient and effective time series forecasting. Code is available at: https://github.com/neumyor/PhaseFormer_TSL
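To make the phase perspective concrete, here is a minimal, hypothetical sketch (not the authors' code) of the underlying idea: given an assumed period P, samples at positions t mod P share a phase, so each phase's history can be modeled with a tiny per-phase predictor. The degree-1 polynomial fit below is a stand-in for the learned phase embeddings, and the sketch omits PhaseFormer's cross-phase routing entirely.

```python
import numpy as np

def split_into_phases(series, period):
    """Return one sub-series per phase index (t mod period)."""
    return [series[p::period] for p in range(period)]

def phase_wise_forecast(series, period, horizon):
    """Forecast `horizon` future steps by extrapolating each phase's
    own history with a linear trend fit. This is only an illustrative
    stand-in; the real model also mixes information across phases."""
    phases = split_into_phases(series, period)
    forecast = np.empty(horizon)
    for t in range(horizon):
        p = (len(series) + t) % period    # phase of the future step
        k = (len(series) + t) // period   # cycle index to predict
        sub = phases[p]                   # history of that phase
        coef = np.polyfit(np.arange(len(sub)), sub, 1)
        forecast[t] = np.polyval(coef, k)
    return forecast

# Toy check: a noiseless sine of period 24 is perfectly phase-predictable
t = np.arange(240)
y = np.sin(2 * np.pi * t / 24)
pred = phase_wise_forecast(y, period=24, horizon=24)
true = np.sin(2 * np.pi * np.arange(240, 264) / 24)
print(np.max(np.abs(pred - true)))
```

Because each phase of a clean periodic signal is a near-constant sub-series, even this trivial per-phase model recovers the next cycle almost exactly; the paper's contribution is doing this compactly while still capturing global, cross-phase structure.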