AI Summary
Existing methods for restoring images degraded by adverse weather (e.g., fog, rain, snow) typically model only a single degradation type, resulting in limited generalization. To address this, we propose the first end-to-end unified weather removal framework. Our method introduces a novel texture-structure collaborative decomposition module, marking the first integration of decomposition principles into generic weather restoration. We further design a lightweight quaternion-based encoder-decoder Transformer architecture, incorporating attention-based feature fusion and adaptive multi-degradation modeling. A quaternion similarity loss is proposed to preserve color fidelity, and a low-light correction mechanism is seamlessly integrated. Extensive experiments on multiple benchmark datasets and real-world scenes demonstrate significant superiority over state-of-the-art methods, effectively removing compound degradations (e.g., fog combined with rain streaks) and notably improving downstream object detection accuracy.
Abstract
Images used in real-world applications such as image and video retrieval, outdoor surveillance, and autonomous driving suffer from poor weather conditions. Removing adverse weather effects such as haze, rain, and snow is therefore a significant problem when designing robust computer vision systems. Recently, deep-learning methods have offered solutions for single types of degradation, but current state-of-the-art universal methods struggle with combinations of degradations, such as haze together with rain streaks, and few algorithms perform well when presented with images containing multiple adverse weather conditions. This work develops an efficient solution for multiple adverse weather removal using a unified quaternion neural architecture called CMAWRNet. It is based on a novel texture-structure decomposition block, a novel lightweight encoder-decoder quaternion transformer architecture, and an attentive fusion block with low-light correction. We also introduce a quaternion similarity loss function to better preserve color information. Quantitative and qualitative evaluation on current state-of-the-art benchmark datasets and real-world images shows the performance advantages of the proposed CMAWRNet over other state-of-the-art weather removal approaches that handle multiple weather artifacts. Extensive computer simulations validate that CMAWRNet improves the performance of downstream applications such as object detection. To our knowledge, this is the first time the decomposition approach has been applied to the universal weather removal task.
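The abstract does not give the exact form of the quaternion similarity loss, but the general idea behind quaternion color representations can be illustrated with a minimal NumPy sketch. Here each RGB pixel is embedded as a pure quaternion (zero real part, R/G/B as the imaginary parts), and a cosine-style similarity between per-pixel quaternions is turned into a loss. The function names and the exact loss formula are illustrative assumptions, not the paper's definition:

```python
import numpy as np

def to_quaternion(img):
    """Embed an RGB image (H, W, 3) as pure quaternions (H, W, 4):
    real part is zero, R/G/B fill the three imaginary parts."""
    h, w, _ = img.shape
    q = np.zeros((h, w, 4), dtype=np.float64)
    q[..., 1:] = img
    return q

def quaternion_similarity_loss(pred, target, eps=1e-8):
    """Illustrative loss: 1 minus the mean per-pixel cosine similarity
    between quaternion embeddings. Treating each pixel as one 4-vector
    couples the three color channels, so chromatic (hue) deviations are
    penalized jointly rather than channel by channel."""
    qp, qt = to_quaternion(pred), to_quaternion(target)
    dot = np.sum(qp * qt, axis=-1)
    norms = np.linalg.norm(qp, axis=-1) * np.linalg.norm(qt, axis=-1)
    return 1.0 - np.mean(dot / (norms + eps))
```

Because cosine similarity is invariant to per-pixel magnitude, a loss of this shape emphasizes the direction of the color vector (its hue balance) over its brightness, which matches the stated goal of preserving color fidelity.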