🤖 AI Summary
Zero-shot time series forecasting often suffers from limited generalization on unseen datasets, and existing retrieval-augmented approaches frequently introduce irrelevant information by retrieving a fixed number of samples. To address this, the work proposes Cross-RAG, a framework that applies a cross-attention mechanism between the query and retrieved samples to dynamically model their relevance and adaptively fuse external knowledge. This design removes the constraint of fixed retrieval counts and is compatible with diverse time series foundation models and retrieval strategies. Extensive experiments across multiple zero-shot scenarios demonstrate that Cross-RAG significantly improves forecasting performance, underscoring its robustness and effectiveness.
📝 Abstract
Recent advances in time series foundation models (TSFMs) demonstrate strong expressive capacity through large-scale pretraining across diverse time series domains. Zero-shot forecasting with TSFMs, however, exhibits limited generalization to unseen datasets, which retrieval-augmented forecasting mitigates by leveraging an external knowledge base. Existing approaches rely on a fixed number of retrieved samples, which may introduce irrelevant information. To address this, we propose Cross-RAG, a zero-shot retrieval-augmented forecasting framework that selectively attends to query-relevant retrieved samples. Cross-RAG models input-level relevance between the query and retrieved samples via query-retrieval cross-attention, while jointly incorporating information from both. Extensive experiments demonstrate that Cross-RAG consistently improves zero-shot forecasting performance across various TSFMs and retrieval-augmented generation (RAG) methods, and additional analyses confirm its effectiveness across diverse retrieval scenarios. Code is available at https://github.com/seunghan96/cross-rag/.
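To make the core idea concrete, here is a minimal numpy sketch of the kind of query-retrieval cross-attention the abstract describes: the query embedding scores each retrieved sample, and a softmax over those scores adaptively weights the fusion instead of treating a fixed top-k equally. Note this is an illustrative reconstruction under assumed shapes and names (`cross_attention_fuse`, single-head, no learned projections), not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention_fuse(query, retrieved):
    """Fuse a query embedding with retrieved-sample embeddings.

    query:     (d,)   embedding of the query series
    retrieved: (k, d) embeddings of k retrieved samples
    Returns (weights, fused): attention weights over the k samples
    and the weighted combination of the retrieved embeddings.
    """
    d = query.shape[-1]
    scores = retrieved @ query / np.sqrt(d)  # relevance of each sample to the query
    weights = softmax(scores)                # adaptive weighting, not a hard cutoff
    fused = weights @ retrieved              # relevance-weighted external knowledge
    return weights, fused
```

Irrelevant retrieved samples receive near-zero weight rather than being forced into the forecast, which is how a soft attention mechanism sidesteps the fixed-retrieval-count problem; the fused vector would then condition the downstream TSFM forecast.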