Cross-RAG: Zero-Shot Retrieval-Augmented Time Series Forecasting via Cross-Attention

📅 2026-03-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Zero-shot time series forecasting often suffers from limited generalization on unseen datasets, and existing retrieval-augmented approaches frequently introduce irrelevant information. To address this, the authors propose Cross-RAG, a framework that applies a cross-attention mechanism between the query and retrieved samples to dynamically model their relevance and adaptively fuse external knowledge. This design removes the constraint of a fixed retrieval count and is compatible with diverse time series foundation models and retrieval strategies. Extensive experiments across multiple zero-shot scenarios demonstrate that Cross-RAG significantly improves forecasting performance, underscoring its robustness and effectiveness.

📝 Abstract
Recent advances in time series foundation models (TSFMs) demonstrate strong expressive capacity through large-scale pretraining across diverse time series domains. Zero-shot time series forecasting with TSFMs, however, exhibits limited generalization to unseen datasets, which retrieval-augmented forecasting addresses by leveraging an external knowledge base. Existing approaches rely on a fixed number of retrieved samples that may introduce irrelevant information. To this end, we propose Cross-RAG, a zero-shot retrieval-augmented forecasting framework that selectively attends to query-relevant retrieved samples. Cross-RAG models input-level relevance between the query and retrieved samples via query-retrieval cross-attention, while jointly incorporating information from the query and retrieved samples. Extensive experiments demonstrate that Cross-RAG consistently improves zero-shot forecasting performance across various TSFMs and RAG methods, and additional analyses confirm its effectiveness across diverse retrieval scenarios. Code is available at https://github.com/seunghan96/cross-rag/.
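The abstract describes the core mechanism as query-retrieval cross-attention: rather than concatenating a fixed number of retrieved samples, the query embedding attends over the retrieved-sample embeddings so that relevance weights are learned per query. The sketch below is a minimal, illustrative version of that idea using scaled dot-product attention in numpy; it is not the paper's implementation, and the function name and shapes are assumptions for illustration only.

```python
import numpy as np

def cross_attention_fusion(query, retrieved):
    """Illustrative query-retrieval cross-attention (not the paper's code).

    query:     (d,)   embedding of the query time series
    retrieved: (k, d) embeddings of k retrieved samples
    Returns a relevance-weighted fusion of the retrieved samples
    plus the attention weights over them.
    """
    d = query.shape[-1]
    # Scaled dot-product relevance scores between query and each retrieved sample
    scores = retrieved @ query / np.sqrt(d)          # (k,)
    # Softmax over retrieved samples: irrelevant samples get low weight
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                          # (k,), sums to 1
    # Adaptive fusion: weighted sum of retrieved-sample embeddings
    context = weights @ retrieved                     # (d,)
    return context, weights
```

Because the softmax weights depend on the query, the number of retrieved samples `k` can vary freely, which is how a cross-attention formulation sidesteps the fixed-retrieval-count constraint the abstract mentions.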
Problem

Research questions and friction points this paper is trying to address.

zero-shot forecasting
time series
retrieval-augmented generation
generalization
irrelevant information
Innovation

Methods, ideas, or system contributions that make the work stand out.

Cross-Attention
Retrieval-Augmented Forecasting
Zero-Shot Time Series Forecasting
Time Series Foundation Models
Query-Relevant Retrieval
Seunghan Lee
Yonsei University
Deep Learning · Machine Learning
Jaehoon Lee
LG AI Research, Seoul, South Korea
Jun Seo
LG AI Research, Seoul, South Korea
Sungdong Yoo
LG AI Research, Seoul, South Korea
Minjae Kim
LG AI Research, Seoul, South Korea
Tae Yoon Lim
LG AI Research, Seoul, South Korea
Dongwan Kang
LG AI Research, Seoul, South Korea
Hwanil Choi
LG AI Research, Seoul, South Korea
SoonYoung Lee
LG AI Research, Seoul, South Korea
Wonbin Ahn
LG AI Research, Seoul, South Korea