🤖 AI Summary
This work addresses a limitation of current large language models in time series forecasting: their lack of explicit experience accumulation and continual-evolution capabilities. The authors propose a memory-driven forecasting framework that reframes prediction as an experience-conditioned reasoning process. By organizing historical patterns, reasoning logic, and generalizable regularities into a hierarchical memory structure, the model learns reusable experiential knowledge during training through reasoning-trajectory distillation and temporal-feature induction. At inference time, a dynamic confidence-aware adaptation mechanism grounds predictions in retrieved memories and enables the model to evolve continually. Extensive experiments demonstrate that the proposed method significantly outperforms existing approaches across multiple benchmark datasets, substantiating the efficacy and potential of experience-conditioned reasoning in time series forecasting.
📝 Abstract
Time series forecasting (TSF) plays a critical role in decision-making for many real-world applications. Recently, LLM-based forecasters have made promising advancements. Despite their effectiveness, existing methods often lack explicit experience accumulation and continual evolution. In this work, we propose MemCast, a learning-to-memory framework that reformulates TSF as an experience-conditioned reasoning task. Specifically, we learn experience from the training set and organize it into a hierarchical memory. This is achieved by summarizing prediction results into historical patterns, distilling inference trajectories into reasoning wisdom, and inducing extracted temporal features into general laws. Furthermore, during inference, we leverage historical patterns to guide the reasoning process and utilize reasoning wisdom to select better trajectories, while general laws serve as criteria for reflective iteration. Additionally, to enable continual evolution, we design a dynamic confidence adaptation strategy that updates the confidence of individual entries without leaking the test set distribution. Extensive experiments on multiple datasets demonstrate that MemCast consistently outperforms previous methods, validating the effectiveness of our approach. Our code is available at https://github.com/Xiaoyu-Tao/MemCast-TS.
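To make the hierarchical-memory idea concrete, here is a minimal, self-contained Python sketch of a three-tier memory (historical patterns, reasoning wisdom, general laws) with confidence-weighted retrieval and a per-entry confidence update. All class and method names are illustrative assumptions for exposition, not MemCast's actual implementation; the paper's memory entries, retrieval scoring, and update rule may differ.

```python
import math

class MemoryEntry:
    """One stored experience: a feature key, its content, and a confidence."""
    def __init__(self, key, content, confidence=0.5):
        self.key = key            # feature vector summarizing the experience
        self.content = content    # e.g. a pattern, a reasoning hint, or a law
        self.confidence = confidence

def cosine(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

class HierarchicalMemory:
    """Three tiers mirroring the abstract: patterns, wisdom, laws."""
    def __init__(self):
        self.tiers = {"patterns": [], "wisdom": [], "laws": []}

    def add(self, tier, key, content, confidence=0.5):
        self.tiers[tier].append(MemoryEntry(key, content, confidence))

    def retrieve(self, tier, query, k=2):
        # Rank entries by similarity to the query, weighted by confidence,
        # so low-confidence experiences are gradually retrieved less often.
        scored = sorted(self.tiers[tier],
                        key=lambda e: cosine(e.key, query) * e.confidence,
                        reverse=True)
        return scored[:k]

    def update_confidence(self, entry, was_helpful, lr=0.2):
        # Nudge confidence toward 1 if the entry helped the prediction,
        # toward 0 otherwise; only per-entry feedback is used, so no
        # test-set distribution statistics leak into the memory.
        target = 1.0 if was_helpful else 0.0
        entry.confidence += lr * (target - entry.confidence)
```

A usage example: store two patterns, retrieve the best match for a query, then demote it after an unhelpful prediction.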