Rethinking Model Redundancy for Low-light Image Enhancement

📅 2024-12-21
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing low-light image enhancement (LLIE) methods rely on over-parameterized deep models that exhibit two types of parameter redundancy: *parameter harmfulness*, where certain parameters actively degrade performance, and *parameter uselessness*, where others contribute negligibly to representation learning. This work systematically identifies and characterizes both redundancies in LLIE models. To address them, the authors propose Attention Dynamic Reallocation (ADR), which dynamically reallocates attention based on the original attention to mitigate parameter harmfulness, and Parameter Orthogonal Generation (POG), which learns orthogonal basis embeddings of parameters and prevents degradation to static parameters, mitigating parameter uselessness. Experiments across multiple benchmarks validate the effectiveness of both techniques, reportedly reducing model parameters by 18% and accelerating inference by 23%. The authors state that the code will be released publicly.

📝 Abstract
Low-light image enhancement (LLIE) is a fundamental task in computational photography, aiming to improve illumination, reduce noise, and enhance the image quality of low-light images. While recent advancements primarily focus on customizing complex neural network models, we have observed significant redundancy in these models, limiting further performance improvement. In this paper, we investigate and rethink the model redundancy for LLIE, identifying parameter harmfulness and parameter uselessness. Inspired by the rethinking, we propose two innovative techniques to mitigate model redundancy while improving the LLIE performance: Attention Dynamic Reallocation (ADR) and Parameter Orthogonal Generation (POG). ADR dynamically reallocates appropriate attention based on original attention, thereby mitigating parameter harmfulness. POG learns orthogonal basis embeddings of parameters and prevents degradation to static parameters, thereby mitigating parameter uselessness. Experiments validate the effectiveness of our techniques. We will release the code to the public.
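The abstract describes ADR as reallocating "appropriate attention based on original attention" to suppress the influence of harmful parameters. The paper's actual implementation is not given here, so the following is a minimal, hypothetical sketch: it assumes ADR can be modeled as a lightweight learnable remapping of an existing softmax attention map, followed by renormalization. The class name, the 1×1 convolution across heads, and the residual form are all assumptions for illustration, not the authors' design.

```python
import torch
import torch.nn as nn


class AttentionDynamicReallocation(nn.Module):
    """Hypothetical ADR sketch: reallocate attention weights based on the
    original attention map (structure assumed, not the paper's code)."""

    def __init__(self, num_heads: int):
        super().__init__()
        # Lightweight cross-head remapping of the original attention scores.
        self.remap = nn.Conv2d(num_heads, num_heads, kernel_size=1)

    def forward(self, attn: torch.Tensor) -> torch.Tensor:
        # attn: (batch, heads, queries, keys), softmax-normalized.
        # Predict a residual correction from the original attention,
        # then renormalize so each row is again a valid distribution.
        corrected = attn + self.remap(attn)
        return torch.softmax(corrected, dim=-1)
```

Because the module only consumes and re-emits an attention map, a sketch like this could in principle be dropped behind any attention layer without changing the surrounding architecture.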
Problem

Research questions and friction points this paper is trying to address.

Addressing model redundancy in low-light image enhancement
Mitigating parameter harmfulness and uselessness in neural networks
Improving performance through attention reallocation and orthogonal generation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Attention Dynamic Reallocation mitigates parameter harmfulness
Parameter Orthogonal Generation prevents degradation to static parameters
Techniques reduce model redundancy while improving enhancement performance
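The POG idea above, generating parameters from orthogonal basis embeddings so they stay input-dependent rather than collapsing to a static set, can be sketched as follows. This is an illustrative guess at one possible realization, not the paper's implementation: the class name, the QR-based orthogonalization, and the pooled-feature coefficient head are all assumptions.

```python
import torch
import torch.nn as nn


class ParameterOrthogonalGeneration(nn.Module):
    """Hypothetical POG sketch: build conv weights from an input-conditioned
    combination of orthogonalized basis embeddings (assumed structure)."""

    def __init__(self, num_bases: int, out_ch: int, in_ch: int, k: int):
        super().__init__()
        # Learnable basis embeddings, one flattened weight tensor per basis.
        self.bases = nn.Parameter(torch.randn(num_bases, out_ch * in_ch * k * k))
        # Coefficients are predicted from the input, keeping weights dynamic.
        self.coef = nn.Linear(in_ch, num_bases)
        self.shape = (out_ch, in_ch, k, k)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Orthogonalize the basis embeddings (as rows) via QR decomposition.
        q, _ = torch.linalg.qr(self.bases.t())  # columns of q are orthonormal
        basis = q.t()                           # (num_bases, out*in*k*k)
        # Input-conditioned coefficients prevent degradation to static
        # parameters: different inputs yield different generated weights.
        pooled = x.mean(dim=(0, 2, 3))          # (in_ch,) global statistics
        w = (self.coef(pooled) @ basis).view(self.shape)
        return nn.functional.conv2d(x, w, padding=self.shape[-1] // 2)
```

Under this sketch, orthogonality keeps the bases non-redundant (no basis is expressible by the others), which is one plausible reading of how POG yields compact yet expressive parameterizations.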
Authors
Tong Li (Beijing Institute of Technology)
Lizhi Wang (Beijing Normal University)
Hansen Feng (Beijing Institute of Technology)
Lin Zhu (Beijing Institute of Technology)
Wanxuan Lu (Chinese Academy of Sciences)
Hua Huang (Beijing Normal University)

Topics: denoising, super-resolution, image restoration, Image and video processing, Computational