🤖 AI Summary
Numerical weather prediction (NWP) suffers from high computational cost and physical inconsistency, while purely data-driven deep learning models lack physical interpretability and generalizability. To address these challenges, this paper proposes PhyDL-NWP, a physics-guided continuous spatiotemporal deep learning framework. PhyDL-NWP embeds governing equations and implicit forcing parameterizations into end-to-end learning, combining automatic differentiation, a physics-informed loss function, NeRF-style continuous neural representations, and a lightweight parameterization architecture. This enables resolution-agnostic field modeling and efficient downscaling. With only 55K parameters, the model achieves up to a 170× inference speedup over traditional NWP, significantly improves forecast accuracy, and better preserves conservation of mass and energy. It also supports short-term forecasting at arbitrary resolution and lightweight fine-tuning of pre-trained models.
📝 Abstract
Weather forecasting is essential but remains computationally intensive and physically incomplete under traditional numerical weather prediction (NWP) methods. Deep learning (DL) models offer efficiency and accuracy but often ignore physical laws, limiting their interpretability and generalization. We propose PhyDL-NWP, a physics-guided deep learning framework that integrates physical equations and latent-force parameterization into data-driven models. It predicts weather variables at arbitrary spatiotemporal coordinates, computes physical terms via automatic differentiation, and uses a physics-informed loss to align predictions with the governing dynamics. PhyDL-NWP enables resolution-free downscaling by modeling weather as a continuous function, and it fine-tunes pre-trained models with minimal overhead, achieving up to 170× faster inference with only 55K parameters. Experiments show that PhyDL-NWP improves both forecasting performance and physical consistency.
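The core recipe described above (a continuous coordinate-to-field network, physical terms computed by automatic differentiation, and a combined data + physics loss) can be sketched in a few lines of PyTorch. This is a minimal illustrative sketch, not the paper's implementation: the `ContinuousField` MLP, the toy 1D advection residual `u_t + c * u_x = 0`, and the weighting `lam` are all assumptions standing in for PhyDL-NWP's full governing equations and learned forcing parameterization.

```python
import torch
import torch.nn as nn

class ContinuousField(nn.Module):
    """NeRF-style MLP mapping continuous coordinates (x, t) to a weather
    variable u. Hypothetical stand-in for the paper's field network."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords):
        return self.net(coords)

def physics_informed_loss(model, coords, u_obs, c=1.0, lam=0.1):
    """Data-fit loss plus a PDE-residual penalty. The residual here is a
    toy advection equation, chosen only to illustrate the mechanism."""
    coords = coords.clone().requires_grad_(True)
    u = model(coords)
    # Automatic differentiation yields exact partial derivatives of the
    # learned field with respect to the input coordinates.
    grads = torch.autograd.grad(u.sum(), coords, create_graph=True)[0]
    u_x, u_t = grads[:, 0:1], grads[:, 1:2]
    residual = u_t + c * u_x            # toy governing-equation residual
    data_loss = ((u - u_obs) ** 2).mean()
    phys_loss = (residual ** 2).mean()
    return data_loss + lam * phys_loss

model = ContinuousField()
coords = torch.rand(64, 2)              # random (x, t) sample points
u_obs = torch.sin(coords[:, 0:1])       # synthetic "observations"
loss = physics_informed_loss(model, coords, u_obs)
loss.backward()                         # gradients for a training step
```

Because the trained model is a continuous function of coordinates rather than a gridded output, it can be queried at any `(x, t)`, which is what makes resolution-free downscaling possible: evaluating the same network on a finer coordinate grid requires no retraining.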