🤖 AI Summary
This study addresses the high computational cost of radiative transfer calculations in numerical weather prediction, the inefficiency of traditional physical parameterizations, and the challenges machine learning models face regarding coupling compatibility and long-term integration stability in operational systems. To overcome these issues, the authors propose a surrogate model based on a residual convolutional neural network, which is trained offline and then coupled online into the China Meteorological Administration's global operational system via LibTorch. The approach incorporates physics-informed output constraints and an experience replay mechanism to keep predictions physically plausible and the coupled integration stable. In a two-month retrospective experiment, the model achieves accuracy comparable to the RRTMG scheme while accelerating the radiation computation by approximately eightfold. Notably, it completes ten-day continuous integrations, overcoming the stability bottleneck that has hindered hybrid models in long-range forecasting applications.
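The summary does not spell out what the physics-informed output constraints look like. A minimal sketch of one plausible form, with all names and bounds hypothetical: clamp predicted heating-rate profiles to a physically reasonable range and force radiative fluxes to be non-negative before handing them back to the dynamical core.

```python
# Hypothetical sketch only: the paper's actual constraints are not given in
# this summary. We assume the emulator outputs a heating-rate profile (K/day)
# and a set of radiative fluxes (W/m^2) per atmospheric column.

def constrain_outputs(heating_rate, fluxes, hr_bounds=(-50.0, 50.0)):
    """Clamp heating rates to hr_bounds and force fluxes non-negative."""
    lo, hi = hr_bounds
    constrained_hr = [min(max(h, lo), hi) for h in heating_rate]
    constrained_flux = [max(f, 0.0) for f in fluxes]
    return constrained_hr, constrained_flux
```

Constraints of this kind are cheap to apply at every model step and prevent a single unphysical prediction from feeding back into the dynamics and destabilizing a long integration.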
📝 Abstract
Radiation is typically the most time-consuming physical process in numerical models. One solution is to emulate the radiation process with machine learning methods to improve computational efficiency. From an operational standpoint, this study investigates critical limitations of hybrid forecasting frameworks that embed deep neural networks into numerical prediction models, focusing on two fundamental bottlenecks: coupling compatibility and long-term integration stability. A residual convolutional neural network is employed to approximate the Rapid Radiative Transfer Model for General Circulation Models (RRTMG) within the global operational system of the China Meteorological Administration. We adopt an offline-training, online-coupling approach. First, a comprehensive dataset is generated through model simulations, encompassing all atmospheric columns both with and without cloud cover. To ensure the stability of the hybrid model, the dataset is enhanced via experience replay, and additional physically based output constraints are imposed. In addition, a LibTorch-based coupling method is employed, which is better suited to real-time operational computation. The hybrid model is capable of performing the required ten-day continuous forecasts. A two-month operational reforecast experiment demonstrates that the machine learning emulator achieves accuracy comparable to that of the traditional physical scheme while running approximately eight times faster.
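The abstract says the dataset is "enhanced via experience replay" but gives no details. A common form of this technique keeps a bounded buffer of previously seen samples and mixes random draws from it into each new training batch, so the emulator does not drift toward only the most recent model states. A minimal stdlib-only sketch, with all names, capacities, and ratios hypothetical:

```python
import random
from collections import deque

class ReplayBuffer:
    """Fixed-capacity store of (input_column, radiation_target) samples."""

    def __init__(self, capacity=10000, seed=0):
        self.buffer = deque(maxlen=capacity)  # oldest samples drop off automatically
        self.rng = random.Random(seed)

    def add(self, sample):
        self.buffer.append(sample)

    def sample(self, batch_size):
        # Draw a random mini-batch of previously seen samples (without replacement).
        return self.rng.sample(list(self.buffer), min(batch_size, len(self.buffer)))

def mixed_batch(new_samples, buffer, replay_fraction=0.5):
    """Combine fresh samples with replayed ones, then store the fresh samples."""
    n_replay = int(len(new_samples) * replay_fraction)
    batch = list(new_samples) + buffer.sample(n_replay)
    for s in new_samples:
        buffer.add(s)
    return batch
```

The `replay_fraction` knob controls how strongly old regimes are re-weighted; the paper's actual buffer design and mixing ratio may differ.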