🤖 AI Summary
To address the demand for lightweight color image denoising on resource-constrained edge devices, this paper proposes the first channel-aware, lookup table (LUT)-based, ultra-low-overhead denoising framework. Methodologically, the authors design a Pairwise Channel Mixer to jointly model inter-channel correlations and spatial dependencies; introduce an L-shaped sparse convolution to enlarge the receptive field while reducing parameter count; and quantize all modules offline into efficient LUTs for table-lookup inference. The contributions include: a model requiring only 500 KB of storage, consuming 0.1% of the energy of DnCNN while achieving 20× faster inference; state-of-the-art PSNR, over 1 dB higher than existing LUT-based methods, under extreme resource constraints; and, for the first time, the unification of high-fidelity denoising performance with a sub-megabyte deployment footprint.
📝 Abstract
While deep neural networks have revolutionized image denoising capabilities, their deployment on edge devices remains challenging due to substantial computational and memory requirements. To this end, we present DnLUT, an ultra-efficient lookup table-based framework that achieves high-quality color image denoising with minimal resource consumption. Our key innovation lies in two complementary components: a Pairwise Channel Mixer (PCM) that effectively captures inter-channel correlations and spatial dependencies in parallel, and a novel L-shaped convolution design that maximizes receptive field coverage while minimizing storage overhead. By converting these components into optimized lookup tables post-training, DnLUT achieves remarkable efficiency: it requires only 500 KB of storage and 0.1% of the energy consumption of its CNN counterpart DnCNN, while delivering 20× faster inference. Extensive experiments demonstrate that DnLUT outperforms all existing LUT-based methods by over 1 dB in PSNR, establishing a new state-of-the-art in resource-efficient color image denoising. The project is available at https://github.com/Stephen0808/DnLUT.
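The core idea behind table-lookup inference, as described in the abstract, is that a network with a small, quantized receptive field can be evaluated offline for every possible input and its outputs stored in a table; online inference then reduces to indexing, with no multiplications. The following is a minimal, hypothetical sketch of this principle (not DnLUT's actual architecture): `tiny_net` is a stand-in for a trained micro-network over a 2-pixel quantized input.

```python
import numpy as np

BITS = 4              # quantize each input pixel to 4 bits (16 levels)
LEVELS = 1 << BITS

def tiny_net(patch):
    """Stand-in for a trained micro-network on a 2-pixel input (hypothetical)."""
    return 0.6 * patch[0] + 0.4 * patch[1]

# Offline step: bake every possible quantized input into a lookup table.
lut = np.empty((LEVELS, LEVELS), dtype=np.float32)
for a in range(LEVELS):
    for b in range(LEVELS):
        lut[a, b] = tiny_net((a, b))

def lut_inference(a, b):
    """Online step: inference is a single table read, no arithmetic."""
    return lut[a, b]
```

The table grows exponentially with receptive-field size and bit depth (here 16² entries), which is why LUT-based methods must keep each lookup unit's input small; the paper's L-shaped convolution and channel-pair mixing are ways of enlarging effective coverage without blowing up table size.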