🤖 AI Summary
This work addresses the ongoing challenge of modeling the autocorrelation structures inherent in both historical and target sequences for deep time-series forecasting. From the perspective of autocorrelation modeling, the paper proposes a unified taxonomy that organizes existing approaches along two key dimensions: model architectures and learning objectives. By establishing a comprehensive review framework and an open-source repository, this study clarifies the essential technical pathways in autocorrelation-aware forecasting and provides a structured reference for future work, fostering more systematic progress in this domain.
📝 Abstract
Autocorrelation is a defining characteristic of time-series data, where each observation is statistically dependent on its predecessors. In the context of deep time-series forecasting, autocorrelation arises in both the input history and the label sequences, presenting two central research challenges: (1) designing neural architectures that model autocorrelation in history sequences, and (2) devising learning objectives that model autocorrelation in label sequences. Recent studies have made strides in tackling these challenges, but a systematic survey examining both aspects remains lacking. To bridge this gap, this paper provides a comprehensive review of deep time-series forecasting from the perspective of autocorrelation modeling. In contrast to existing surveys, this work makes two distinctive contributions. First, it proposes a novel taxonomy that encompasses recent literature on both model architectures and learning objectives -- whereas prior surveys neglect or inadequately discuss the latter aspect. Second, it offers a thorough analysis of the motivations, insights, and progression of the surveyed literature from a unified, autocorrelation-centric perspective, providing a holistic overview of the evolution of deep time-series forecasting. The full list of papers and resources is available at https://github.com/Master-PLC/Awesome-TSF-Papers.
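The abstract's core notion, that each observation depends statistically on its predecessors, can be made concrete with the standard lag-k sample autocorrelation. The sketch below is purely illustrative (the AR(1) coefficient and lag choices are assumptions, not from the paper):

```python
# A minimal sketch of lag-k sample autocorrelation: the degree to which
# each observation in a series depends on its predecessors. The AR(1)
# series and the lags printed below are illustrative choices only.
import numpy as np

def autocorr(x: np.ndarray, lag: int) -> float:
    """Sample autocorrelation of x at the given positive lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    # Normalize the lag-k autocovariance by the lag-0 autocovariance.
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

# Synthetic AR(1) process x_t = 0.8 * x_{t-1} + noise, whose true
# autocorrelation decays roughly geometrically as 0.8 ** lag.
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(1, len(x)):
    x[t] = 0.8 * x[t - 1] + rng.normal()

for lag in (1, 2, 5):
    print(f"lag {lag}: {autocorr(x, lag):.2f}")
```

In the forecasting setting the survey describes, this dependence holds within the input history (motivating architecture design) and within the label sequence (motivating the choice of learning objective).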