🤖 AI Summary
Underwater image enhancement is challenging due to the absence of ground-truth references and spatially non-uniform degradation across semantic regions, which often leads to color distortion and detail loss when global processing strategies are applied. To address this, the authors propose SUCode, a novel network that introduces, for the first time, a semantic-aware pixel-wise discrete codebook to adaptively model heterogeneous degradations. The method employs a three-stage training strategy to mitigate pseudo-label contamination and integrates a Gated Channel Attention Module (GCAM) with a Frequency-Aware Feature Fusion mechanism (FAFF). Extensive experiments demonstrate that SUCode significantly outperforms state-of-the-art methods across multiple benchmark datasets, achieving leading performance on both full-reference and no-reference evaluation metrics.
📝 Abstract
Underwater Image Enhancement (UIE) is an ill-posed problem: natural clean references are not available, and degradation levels vary significantly across semantic regions. Existing UIE methods treat images with a single global model and ignore the inconsistent degradation of different scene components, leading to significant color distortions and loss of fine detail in heterogeneous underwater scenes. Therefore, we propose SUCode (Semantic-aware Underwater Codebook Network), which achieves adaptive UIE through a semantic-aware discrete codebook representation. In contrast to one-shot codebook-based methods, SUCode exploits a semantic-aware, pixel-level codebook representation tailored to heterogeneous underwater degradation. A three-stage training paradigm is employed to learn representations of raw underwater image features while avoiding pseudo ground-truth contamination. The Gated Channel Attention Module (GCAM) and the Frequency-Aware Feature Fusion (FAFF) mechanism jointly integrate channel and frequency cues for faithful color restoration and texture recovery. Extensive experiments on multiple benchmarks demonstrate that SUCode achieves state-of-the-art performance, outperforming recent UIE methods on both full-reference and no-reference metrics. The code will be made publicly available at https://github.com/oucailab/SUCode.
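To make the core idea of a semantic-aware, pixel-wise discrete codebook concrete, the sketch below shows one plausible reading: each pixel's feature vector is quantized against the codebook assigned to that pixel's semantic class via nearest-neighbor lookup. All names, shapes, and the per-class codebook layout here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def pixelwise_codebook_lookup(features, codebooks, semantic_map):
    """Illustrative sketch (not the paper's implementation):
    quantize each pixel's D-dim feature against the codebook of
    its semantic class, returning quantized features and indices.

    features:     (H, W, D) float array of per-pixel features
    codebooks:    list of (K, D) arrays, one codebook per class
    semantic_map: (H, W) int array of semantic class labels
    """
    H, W, D = features.shape
    quantized = np.empty_like(features)
    indices = np.empty((H, W), dtype=np.int64)
    for y in range(H):
        for x in range(W):
            cb = codebooks[semantic_map[y, x]]           # (K, D) class codebook
            dists = np.sum((cb - features[y, x]) ** 2, axis=1)
            k = int(np.argmin(dists))                    # nearest code entry
            indices[y, x] = k
            quantized[y, x] = cb[k]
    return quantized, indices

# Toy usage with two hypothetical semantic classes
rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 4, 8))
books = [rng.normal(size=(16, 8)), rng.normal(size=(16, 8))]
sem = rng.integers(0, 2, size=(4, 4))
q, idx = pixelwise_codebook_lookup(feats, books, sem)
```

In a trained network the codebooks would be learned (e.g., VQ-style with a commitment loss) rather than random, and the lookup would be vectorized; the loop form above is kept only for clarity.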