Omni-Weather: Unified Multimodal Foundation Model for Weather Generation and Understanding

📅 2025-12-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing weather modeling treats forecasting and mechanistic interpretation as disjoint tasks. This paper introduces the first unified multimodal foundation model for both meteorological generation and understanding. We design a radar encoder coupled with shared multi-head self-attention to jointly optimize generation (e.g., radar echo extrapolation) and understanding tasks (e.g., phenomenon attribution, trend diagnosis). We propose the first meteorological Chain-of-Thought dataset and a causal reasoning training paradigm that explicitly encodes physical causality. A multi-task joint optimization framework enables bidirectional collaborative transfer between generation and understanding. Experiments demonstrate state-of-the-art performance on both task families, with statistically significant positive transfer gains—confirming synergistic learning. To our knowledge, this is the first architecture to simultaneously achieve high-fidelity spatiotemporal forecasting and physically grounded, interpretable causal inference.
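The summary above describes a radar encoder feeding a self-attention backbone that is shared by a generation head and an understanding head. The paper's actual architecture is not reproduced on this page, so the following is only a minimal numpy sketch of that coupling idea; every dimension, weight name, and head design here is an illustrative assumption.

```python
# Toy sketch of a shared-backbone multi-task design (all shapes/names assumed):
# a radar encoder projects radar frames into tokens, one self-attention block is
# shared by both branches, and separate heads produce a generation output
# (next radar frame) and an understanding output (phenomenon logits).
import numpy as np

rng = np.random.default_rng(0)
D = 32  # shared token dimension (assumed)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, Wq, Wk, Wv):
    """Single-head self-attention shared by both task branches."""
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

# Shared parameters: one attention block serves both tasks, so joint
# optimization of the two losses updates the same backbone weights.
Wq, Wk, Wv = (rng.normal(size=(D, D)) * 0.1 for _ in range(3))
W_radar = rng.normal(size=(16 * 16, D)) * 0.1  # radar encoder: flatten-and-project
W_gen = rng.normal(size=(D, 16 * 16)) * 0.1    # generation head: token -> next frame
W_und = rng.normal(size=(D, 4)) * 0.1          # understanding head: token -> logits

radar_seq = rng.normal(size=(8, 16, 16))       # 8 past radar frames (toy resolution)
tokens = radar_seq.reshape(8, -1) @ W_radar    # radar encoder
h = self_attention(tokens, Wq, Wk, Wv)         # shared backbone

next_frame = (h[-1] @ W_gen).reshape(16, 16)   # echo-extrapolation output
phenomenon_logits = h.mean(axis=0) @ W_und     # attribution / diagnosis output
print(next_frame.shape, phenomenon_logits.shape)
```

Because both heads read the same attended representation `h`, gradients from either task reshape the shared backbone, which is the mechanism a "bidirectional collaborative transfer" claim would rely on.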

📝 Abstract
Weather modeling requires both accurate prediction and mechanistic interpretation, yet existing methods treat these goals in isolation, separating generation from understanding. To address this gap, we present Omni-Weather, the first multimodal foundation model that unifies weather generation and understanding within a single architecture. Omni-Weather integrates a radar encoder for weather generation tasks, followed by unified processing using a shared self-attention mechanism. Moreover, we construct a Chain-of-Thought dataset for causal reasoning in weather generation, enabling interpretable outputs and improved perceptual quality. Extensive experiments show Omni-Weather achieves state-of-the-art performance in both weather generation and understanding. Our findings further indicate that generative and understanding tasks in the weather domain can mutually enhance each other. Omni-Weather also demonstrates the feasibility and value of unifying weather generation and understanding.
Problem

Research questions and friction points this paper is trying to address.

Weather generation (forecasting) and understanding (mechanistic interpretation) are treated as disjoint tasks by existing methods
Multimodal radar data lacks a shared representation that serves both prediction and diagnosis
Forecast outputs are typically opaque, with no explicit causal reasoning behind them
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified multimodal model for weather generation and understanding
Shared self-attention mechanism for unified processing
Chain-of-Thought dataset for interpretable causal reasoning
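The Chain-of-Thought dataset is a key contribution, but its schema is not shown on this page. The record below is a purely hypothetical illustration of how stepwise causal reasoning about a radar scene could be stored and linearized into a supervision string; every field name and example value is an assumption.

```python
# Hypothetical meteorological Chain-of-Thought record (schema assumed, not from
# the paper): past radar frames, a question, an ordered causal reasoning chain,
# and a final answer.
cot_record = {
    "radar_frames": ["t-30min.png", "t-20min.png", "t-10min.png", "t0.png"],
    "question": "Why does the echo region over the northwest quadrant intensify?",
    "reasoning_chain": [
        "Low-level moisture convergence increases ahead of the front.",
        "Rising motion strengthens, deepening convective cells.",
        "Deeper convection raises reflectivity in the northwest quadrant.",
    ],
    "answer": "Frontal lifting of moist air drives convective intensification.",
}

def linearize(record):
    """Flatten a CoT record into a single supervision string (toy formatting)."""
    steps = " ".join(
        f"Step {i + 1}: {s}" for i, s in enumerate(record["reasoning_chain"])
    )
    return f"Q: {record['question']} {steps} A: {record['answer']}"

print(linearize(cot_record))
```

Training on explicit intermediate steps like these, rather than on answers alone, is what would let the model emit interpretable causal chains at inference time.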
👥 Authors
Zhiwang Zhou (Tongji University)
Yuandong Pu (SJTU, Shanghai AI Laboratory)
Xuming He (Shanghai AI Laboratory)
Yidi Liu (University of Science and Technology of China)
Yixin Chen (Shanghai AI Laboratory)
Junchao Gong (Shanghai AI Laboratory)
Xiang Zhuang (Zhejiang University)
Wanghan Xu (Shanghai AI Laboratory)
Qinglong Cao (Shanghai AI Laboratory)
Shixiang Tang (Shanghai AI Laboratory)
Yihao Liu (Shanghai AI Laboratory)
Wenlong Zhang (Shanghai AI Laboratory)
Lei Bai (Shanghai AI Laboratory)
Foundation Model · Science Intelligence · Multi-Agent System · Autonomous Discovery