🤖 AI Summary
Nighttime image deraining is highly challenging due to the strong coupling between rain streaks and low illumination, as well as the scarcity of high-quality labeled data. To address these issues, we introduce HQ-NightRain—the first high-fidelity benchmark specifically designed for nighttime deraining—and propose CST-Net. Our method enhances rain modeling in the Y channel via a learnable color-space transformation, integrates a Y-channel attention mechanism, and incorporates an implicit illumination-guided module to explicitly decouple and jointly optimize rain removal and illumination recovery. Extensive experiments on multiple nighttime deraining benchmarks demonstrate that CST-Net significantly outperforms existing state-of-the-art methods. Both quantitative metrics and qualitative visual results confirm its superior capability in eliminating complex rain artifacts while preserving fine details and structural integrity. The code and dataset are publicly available.
📝 Abstract
Compared to daytime image deraining, nighttime image deraining poses significant challenges due to the inherent complexities of nighttime scenarios and the lack of high-quality datasets that accurately represent the coupling effect between rain and illumination. In this paper, we rethink the task of nighttime image deraining and contribute a new high-quality benchmark, HQ-NightRain, which offers higher harmony and realism than existing datasets. In addition, we develop an effective Color Space Transformation Network (CST-Net) for better removing complex rain from nighttime scenes. Specifically, we propose a learnable color space converter (CSC) to better facilitate rain removal in the Y channel, as nighttime rain is more pronounced in the Y channel than in the RGB color space. To capture illumination information for guiding nighttime deraining, implicit illumination guidance is introduced, enabling the learned features to improve the model's robustness in complex scenarios. Extensive experiments show the value of our dataset and the effectiveness of our method. The source code and datasets are available at https://github.com/guanqiyuan/CST-Net.
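The learnable color space converter described above can be illustrated with a minimal sketch: a 3×3 matrix plus bias, initialized to the standard RGB→YCbCr (BT.601) transform, that a network could then fine-tune during training. The names and structure here are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

# Standard BT.601 RGB -> YCbCr matrix; a learnable CSC could start from
# this initialization and adjust the weights during training.
RGB_TO_YCBCR = np.array([
    [ 0.299,     0.587,     0.114   ],  # Y (luma): rain is most visible here
    [-0.168736, -0.331264,  0.5     ],  # Cb
    [ 0.5,      -0.418688, -0.081312],  # Cr
])
BIAS = np.array([0.0, 0.5, 0.5])  # chroma offsets for the [0, 1] range

def csc_forward(rgb, weight=RGB_TO_YCBCR, bias=BIAS):
    """Apply the (learnable) color-space transform to an HxWx3 image."""
    return rgb @ weight.T + bias

img = np.random.rand(4, 4, 3)   # dummy RGB image in [0, 1]
ycbcr = csc_forward(img)
y = ycbcr[..., 0]               # Y channel, on which rain modeling would focus
```

In a trained model, `weight` and `bias` would be network parameters rather than fixed constants, letting the converter deviate from the fixed YCbCr transform where that helps separate rain from the scene.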