Augur: Modeling Covariate Causal Associations in Time Series via Large Language Models

📅 2025-10-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing LLM-based time series forecasting methods suffer from architectural marginalization, coarse-grained prompting, and poor interpretability. To address these limitations, this paper proposes Augur, a fully LLM-driven forecasting framework that deeply integrates the causal reasoning capabilities of large language models. Augur employs a two-stage teacher-student architecture: the teacher model uses heuristic search and pairwise causal discovery to construct a directed causal graph over the covariates; the student model refines that graph, is fine-tuned on high-confidence causal relationships encoded as structured textual prompts, and produces interpretable, traceable predictions with zero-shot generalization. Evaluated against 25 baselines across multiple real-world datasets, Augur achieves competitive forecasting accuracy, robust zero-shot generalization, and improved transparency. Notably, it is among the first approaches to explicitly model and exploit covariate causal relationships within LLM-based time series forecasting, advancing causal-aware, interpretable LLM applications in temporal modeling.
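As a concrete illustration of the "structured textual prompts" the summary describes, a graph-to-prompt encoder might look like the sketch below. The template wording, the confidence field, and the `min_conf` threshold are assumptions for illustration only, not Augur's actual prompt format.

```python
def causal_prompt(edges, target, min_conf=0.7):
    """Encode high-confidence causal edges as a textual prompt.

    edges: list of (cause, effect, confidence) tuples, as might be
    produced by a pairwise causal-discovery stage (format assumed).
    """
    # Keep only edges the teacher stage is confident about.
    kept = [(c, e, w) for c, e, w in edges if w >= min_conf]
    lines = [f"- {c} -> {e} (confidence {w:.2f})" for c, e, w in kept]
    return (
        "Known causal relationships among covariates:\n"
        + "\n".join(lines)
        + f"\nUsing these relationships, forecast the next values of {target}."
    )
```

Filtering by confidence before prompting keeps the student model grounded in only the relationships the teacher stage strongly supports, which is the mechanism behind the "high-confidence causal associations" mentioned above.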

📝 Abstract
Large language models (LLMs) have emerged as a promising avenue for time series forecasting, offering the potential to integrate multimodal data. However, existing LLM-based approaches face notable limitations, such as a marginalized role in model architectures, reliance on coarse statistical text prompts, and a lack of interpretability. In this work, we introduce Augur, a fully LLM-driven time series forecasting framework that exploits LLM causal reasoning to discover and use directed causal associations among covariates. Augur uses a two-stage teacher-student architecture in which a powerful teacher LLM infers a directed causal graph from time series using heuristic search together with pairwise causality testing. A lightweight student agent then refines the graph and fine-tunes on high-confidence causal associations, which are encoded as rich textual prompts to perform forecasting. This design improves predictive accuracy while yielding transparent, traceable reasoning about variable interactions. Extensive experiments on real-world datasets against 25 baselines demonstrate that Augur achieves competitive performance and robust zero-shot generalization.
Problem

Research questions and friction points this paper is trying to address.

Discovering directed causal associations among time series covariates
Improving interpretability in LLM-based time series forecasting
Enhancing predictive accuracy through causal reasoning prompts
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses teacher-student architecture for causal discovery
Employs heuristic search with pairwise causality testing
Encodes high-confidence causal associations as prompts
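The pairwise causality testing listed above can be sketched in plain Python. This is a minimal stand-in that uses a simple lagged-correlation test as the pairwise check; the paper's actual test, function names, and threshold are not specified here, so all of those are illustrative assumptions.

```python
from itertools import permutations

def lagged_corr(x, y, lag=1):
    """Pearson correlation between x[t - lag] and y[t]."""
    a, b = x[:-lag], y[lag:]
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    if va == 0 or vb == 0:
        return 0.0
    return cov / (va * vb) ** 0.5

def causal_graph(series, lag=1, threshold=0.5):
    """Pairwise test over a dict of named series: keep a directed edge
    u -> v when u's lagged values track v more strongly than the
    reverse direction does."""
    edges = []
    for u, v in permutations(series, 2):
        fwd = abs(lagged_corr(series[u], series[v], lag))
        bwd = abs(lagged_corr(series[v], series[u], lag))
        if fwd > threshold and fwd > bwd:
            edges.append((u, v, fwd))
    return edges
```

On a toy pair where `y` is simply `x` delayed by one step, this keeps the edge `x -> y` and rejects the reverse; a full system would replace the correlation check with a proper statistical causality test and add the LLM-guided heuristic search over candidate graphs.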