Probabilistic Transformers for Joint Modeling of Global Weather Dynamics and Decision-Centric Variables

📅 2026-01-07
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes GEM-2, a probabilistic Transformer with approximately 275 million parameters that jointly learns global atmospheric dynamics and decision-relevant weather functionals, such as extremes and cumulative quantities, in a single lightweight architecture. Conventional numerical weather prediction models do not model these functionals directly, forcing users into post-processing that is often suboptimal and introduces structural biases. GEM-2 instead is optimized end-to-end on the Continuous Ranked Probability Score (CRPS) loss, without multi-stage fine-tuning or diffusion-based refinement. GEM-2 outperforms operational numerical forecasting systems across multiple evaluation metrics, matches the performance of more complex machine learning pipelines, and converges stably to climatological baselines at subseasonal-to-seasonal timescales, substantially enhancing the economic value of forecasts.

📝 Abstract
Weather forecasts sit upstream of high-stakes decisions in domains such as grid operations, aviation, agriculture, and emergency response. Yet forecast users often face a difficult trade-off. Many decision-relevant targets are functionals of the atmospheric state variables, such as extrema, accumulations, and threshold exceedances, rather than state variables themselves. As a result, users must estimate these targets via post-processing, which can be suboptimal and can introduce structural bias. The core issue is that decisions depend on distributions over these functionals that the model is not trained to learn directly. In this work, we introduce GEM-2, a probabilistic transformer that jointly learns global atmospheric dynamics alongside a suite of variables that users directly act upon. Using this training recipe, we show that a lightweight (~275M params) and computationally efficient (~20-100x training speedup relative to state-of-the-art) transformer trained on the CRPS objective can directly outperform operational numerical weather prediction (NWP) models and be competitive with ML models that rely on expensive multi-step diffusion processes or require bespoke multi-stage fine-tuning strategies. We further demonstrate state-of-the-art economic value metrics under decision-theoretic evaluation, stable convergence to climatology at S2S and seasonal timescales, and a surprising insensitivity to many commonly assumed architectural and training design choices.
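The CRPS objective mentioned in the abstract has a standard empirical form for a finite ensemble: CRPS = E|X − y| − ½·E|X − X′|, where X, X′ are independent draws from the forecast distribution and y is the observation. The sketch below illustrates this estimator with NumPy; it is a generic illustration of the CRPS, not the paper's actual training code, and the function name is ours.

```python
import numpy as np

def crps_ensemble(ensemble, observation):
    """Empirical CRPS for a finite ensemble.

    CRPS = E|X - y| - 0.5 * E|X - X'|, where X, X' are independent
    draws from the forecast ensemble and y is the observed value.
    Lower is better; a perfect deterministic forecast scores 0.
    """
    ensemble = np.asarray(ensemble, dtype=float)
    # Mean absolute error of ensemble members against the observation.
    term1 = np.mean(np.abs(ensemble - observation))
    # Mean pairwise spread of the ensemble (all member pairs).
    term2 = 0.5 * np.mean(np.abs(ensemble[:, None] - ensemble[None, :]))
    return term1 - term2

# A perfect, fully confident forecast has zero CRPS.
print(crps_ensemble([2.0, 2.0, 2.0], 2.0))  # 0.0
```

Because the score rewards both accuracy and calibrated spread, training directly on it (as the abstract describes) encourages sharp yet reliable probabilistic forecasts without a separate calibration stage.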
Problem

Research questions and friction points this paper is trying to address.

weather forecasting
decision-centric variables
functional estimation
post-processing bias
probabilistic modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Probabilistic Transformer
Joint Modeling
Decision-Centric Forecasting
CRPS Optimization
Lightweight Weather Model
Paulius Rauba
University of Cambridge
Machine learning
Viktor Cikojevic
Fran Bartolic
Sam Levang
Ty Dickinson
Chase Dwelle