🤖 AI Summary
Ultra-widefield (UWF) retinal images suffer from severe optical distortion, motion blur, and non-uniform illumination, which leave pathological details low in contrast and blurred; existing enhancement methods struggle to achieve global illumination correction and local structural fidelity at the same time. To address this, we propose the first frequency-domain-aware self-supervised enhancement framework designed specifically for UWF imagery. Our method employs frequency-domain decomposition for precise deblurring and integrates a Retinex-inspired illumination compensation module, incorporating asymmetric channel fusion and a color-preservation unit to jointly optimize multi-scale structural details and chromatic consistency. The framework operates in a fully self-supervised, end-to-end manner without requiring paired ground-truth labels. Evaluated on real-world UWF datasets, it significantly improves image sharpness, illumination uniformity, and the visibility of fine pathological features, including microvasculature and exudates, yielding a 4.2% gain in diagnostic accuracy. The approach demonstrates strong potential for clinical deployment.
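The summary's frequency-domain decomposition is not specified at this level of detail; as a minimal illustration of the general idea, the sketch below splits a grayscale image into low- and high-frequency components with a simple radial FFT mask (the `cutoff` parameter and mask shape are assumptions for illustration, not the paper's design):

```python
import numpy as np

def frequency_decompose(img, cutoff=0.1):
    """Split a grayscale image into low- and high-frequency parts
    using a circular low-pass mask in the FFT domain.

    cutoff: radius of the pass band as a fraction of the image size.
    """
    f = np.fft.fftshift(np.fft.fft2(img))          # centered spectrum
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = h / 2.0, w / 2.0
    radius = cutoff * min(h, w)
    mask = ((yy - cy) ** 2 + (xx - cx) ** 2) <= radius ** 2
    low = np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))
    high = img - low                               # residual detail layer
    return low, high

img = np.random.default_rng(0).random((64, 64))
low, high = frequency_decompose(img)
```

In a deblurring pipeline of this kind, the low-frequency branch carries the broader structure (the "global view") while the high-frequency residual carries fine edges and microvascular detail (the "local view"); the two layers sum back to the input by construction.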
📝 Abstract
Ultra-Wide-Field (UWF) retinal imaging has revolutionized retinal diagnostics by providing a comprehensive view of the retina. However, it often suffers from quality-degrading factors such as blurring and uneven illumination, which obscure fine details and mask pathological information. While numerous retinal image enhancement methods have been proposed for other fundus imaging modalities, they often fail to address the unique requirements of UWF imaging, particularly the need to preserve pathological details. In this paper, we propose a novel frequency-aware self-supervised learning method for UWF image enhancement. It incorporates frequency-decoupled image deblurring and Retinex-guided illumination compensation modules. An asymmetric channel integration operation is introduced in the former module to combine global and local views by leveraging high- and low-frequency information, ensuring the preservation of both fine and broader structural details. In addition, a color preservation unit is proposed in the latter Retinex-based module to provide multi-scale spatial and frequency information, enabling accurate illumination estimation and correction. Experimental results demonstrate that the proposed method not only enhances visualization quality but also improves disease diagnosis performance by restoring fine local details and correcting uneven intensity. To the best of our knowledge, this work is the first attempt at UWF image enhancement, offering a robust and clinically valuable tool for improving retinal disease management.
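The abstract's Retinex-guided illumination compensation builds on the classic Retinex decomposition, in which an image is modeled as reflectance times illumination and the illumination is approximated by a heavily smoothed copy of the image. The sketch below shows a plain single-scale Retinex estimate using only NumPy (the Gaussian blur implementation, `sigma`, and `eps` are illustrative assumptions; the paper's module additionally uses multi-scale and frequency cues and a color preservation unit, none of which are reproduced here):

```python
import numpy as np

def gaussian_kernel(sigma):
    # 1-D Gaussian kernel truncated at 3 sigma, normalized to sum to 1
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    return k / k.sum()

def blur(img, sigma):
    # separable Gaussian blur: convolve rows, then columns
    k = gaussian_kernel(sigma)
    tmp = np.apply_along_axis(lambda row: np.convolve(row, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda col: np.convolve(col, k, mode="same"), 0, tmp)

def single_scale_retinex(img, sigma=5.0, eps=1e-6):
    """Estimate illumination as a Gaussian-smoothed image and recover
    log-reflectance as log(I) - log(illumination)."""
    illumination = blur(img, sigma)
    reflectance = np.log(img + eps) - np.log(illumination + eps)
    return illumination, reflectance

# toy image: random texture modulated by a left-to-right brightness gradient
rng = np.random.default_rng(0)
img = rng.random((64, 64)) * np.linspace(0.2, 1.0, 64)
illum, refl = single_scale_retinex(img, sigma=5.0)
```

Once the smooth illumination field is estimated, uneven intensity can be compensated by normalizing the image against it, which is the basic mechanism an illumination-correction module of this family relies on.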