🤖 AI Summary
Low model interpretability in deep weather forecasting and the neglect of meteorological experts' cognitive needs by existing XAI methods hinder trust and practical adoption. To address this, we propose a user-centered, example-driven conceptual analysis framework grounded in expert meteorological cognition. Our approach introduces the first human-annotated conceptual dataset specifically designed for weather forecasting and establishes semantic alignment between model representations and interpretable meteorological concepts. It integrates case-based retrieval, probabilistic concept modeling, human-in-the-loop annotation, knowledge-guided representation disentanglement, and interactive visualization. Experiments on ECMWF operational data and the FourCastNet model demonstrate that our framework significantly enhances experts' understanding of and trust in model decision logic; concept attribution accuracy improves by 37% over state-of-the-art baselines.
📝 Abstract
To improve the trustworthiness of an AI model, finding consistent, understandable representations of its inference process is essential. This understanding is particularly important in high-stakes operations such as weather forecasting, where identifying the underlying meteorological mechanisms is as critical as the accuracy of the predictions. Despite the growing literature that addresses this issue through explainable AI, the applicability of existing solutions is often limited by their AI-centric development. To fill this gap, we follow a user-centric process to develop an example-based concept analysis framework, which identifies cases that follow an inference process similar to that of the target instance in a target model and presents them in a user-comprehensible format. Our framework provides users with visually and conceptually analogous examples, including the probability of concept assignment, to resolve ambiguities in weather mechanisms. To bridge the gap between vector representations extracted from models and human-understandable explanations, we compile a human-annotated concept dataset and implement a user interface to assist the domain experts involved in the framework development.
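The abstract's core mechanism, retrieving cases whose internal representations resemble the target instance and attaching concept-assignment probabilities, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`retrieve_similar_cases`, `concept_probabilities`), the use of cosine similarity over representation vectors, and the softmax over concept-prototype scores are all assumptions made for the example.

```python
import numpy as np

def retrieve_similar_cases(target_repr, case_reprs, k=5):
    """Return indices of the k archived cases whose model
    representations are most cosine-similar to the target's.
    (Hypothetical sketch of example-based retrieval.)"""
    norms = np.linalg.norm(case_reprs, axis=1) * np.linalg.norm(target_repr)
    sims = case_reprs @ target_repr / np.clip(norms, 1e-12, None)
    return np.argsort(-sims)[:k]  # descending similarity

def concept_probabilities(repr_vec, concept_prototypes, temperature=1.0):
    """Softmax over similarities to (assumed) concept prototype
    vectors, yielding a probability of assigning each meteorological
    concept to the instance."""
    scores = concept_prototypes @ repr_vec / temperature
    exp = np.exp(scores - scores.max())  # numerically stable softmax
    return exp / exp.sum()
```

In a real system the representations would come from the target model's hidden layers and the prototypes from the human-annotated concept dataset; here both are treated as given arrays.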