Hierarchical Graph Networks for Accurate Weather Forecasting via Lightweight Training

📅 2025-10-24
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Weather forecasting is difficult because physical processes couple across spatiotemporal scales: fixed-resolution models struggle to capture multiscale dynamics, while existing hierarchical graph neural networks (HGNNs) tend to lose global trends during downsampling, compromising physical consistency. To address this, the paper proposes HiFlowCast and its ensemble variant HiAntFlow, multiscale HGNN forecasting frameworks. Their key innovations are: (1) a Latent-Memory-Retention mechanism that preserves the continuity of global climate dynamics during downward information propagation; and (2) a Latent-to-Physics branch that explicitly incorporates PDE-derived physical fields for cross-scale physical embedding. Combined with pretrained-weight initialization and a lightweight ensemble training strategy, the models reduce average error by over 5% at 13-day lead times and by 5–8% on extreme events (1st/99th percentiles). They converge within a single epoch, significantly reducing computational cost and carbon footprint.

πŸ“ Abstract
Climate events arise from intricate, multivariate dynamics governed by global-scale drivers, profoundly impacting food, energy, and infrastructure. Yet, accurate weather prediction remains elusive due to physical processes unfolding across diverse spatio-temporal scales, which fixed-resolution methods cannot capture. Hierarchical Graph Neural Networks (HGNNs) offer a multiscale representation, but nonlinear downward mappings often erase global trends, weakening the integration of physics into forecasts. We introduce HiFlowCast and its ensemble variant HiAntFlow, HGNNs that embed physics within a multiscale prediction framework. Two innovations underpin their design: a Latent-Memory-Retention mechanism that preserves global trends during downward traversal, and a Latent-to-Physics branch that integrates PDE solution fields across diverse scales. Our Flow models cut errors by over 5% at 13-day lead times and by 5-8% under 1st and 99th quantile extremes, improving reliability for rare events. Leveraging pretrained model weights, they converge within a single epoch, reducing training cost and their carbon footprint. Such efficiency is vital as the growing scale of machine learning challenges sustainability and limits research accessibility. Code and model weights are in the supplementary materials.
Problem

Research questions and friction points this paper addresses.

Capturing multiscale weather dynamics missed by fixed-resolution methods
Preserving global climate trends during hierarchical downsampling
Reducing computational costs while maintaining extreme event prediction accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hierarchical Graph Networks embed physics in multiscale framework
Latent-Memory-Retention preserves global trends during downward traversal
Latent-to-Physics branch integrates PDE solutions across diverse scales
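The Latent-Memory-Retention idea can be sketched loosely as: pool fine-scale node features into coarse clusters, but re-inject a retained global latent so coarse levels keep the global trend. The sketch below is a minimal, hypothetical illustration in NumPy (the function name, `alpha` weight, and mean-pooling choice are assumptions, not the paper's implementation).

```python
import numpy as np

def downsample_with_memory(x, assign, alpha=0.5):
    """One hierarchical coarsening step with a latent-memory residual.

    x      : (n, d) node features at the fine scale
    assign : (n,) cluster index of each fine node in the coarse graph
    alpha  : weight of the retained global latent (hypothetical knob)

    Returns (m, d) coarse features, m = number of clusters. The global
    mean latent is re-injected after pooling so coarse levels do not
    lose the global trend (a loose sketch of Latent-Memory-Retention).
    """
    n, d = x.shape
    m = assign.max() + 1
    memory = x.mean(axis=0)            # global latent "memory"
    coarse = np.zeros((m, d))
    counts = np.zeros(m)
    for i in range(n):                 # mean-pool fine nodes into clusters
        coarse[assign[i]] += x[i]
        counts[assign[i]] += 1
    coarse /= counts[:, None]
    return coarse + alpha * memory     # retain the global trend

# usage: 4 fine nodes pooled into 2 coarse nodes
x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]])
assign = np.array([0, 0, 1, 1])
coarse = downsample_with_memory(x, assign, alpha=0.0)
# with alpha=0 this reduces to plain mean pooling: [[2, 3], [6, 7]]
```

With `alpha > 0`, each coarse node carries a residual of the global state, which is the property the paper argues nonlinear downward mappings otherwise erase.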