🤖 AI Summary
Existing low-light enhancement methods suffer from poor generalization, limiting their applicability across diverse domains such as autonomous driving and 3D reconstruction. To address this, we propose the first Mixture-of-Experts (MoE) framework for low-light enhancement, featuring a gating-driven dynamic weighting mechanism. The architecture comprises three parallel experts, each specialized for a distinct enhancement subtask; a learnable gating module enables cross-domain adaptive weight allocation; and each expert incorporates a local-global multi-scale feature fusion module. The entire model is end-to-end trainable. Extensive experiments demonstrate state-of-the-art generalization: our method achieves the best overall performance among 25 competing approaches, attaining SOTA PSNR on five benchmarks and SOTA SSIM on four, significantly improving cross-scene robustness and task versatility.
📝 Abstract
Low-light enhancement has wide applications in autonomous driving, 3D reconstruction, remote sensing, surveillance, and beyond, where it can significantly improve information utilization. However, most existing methods lack generalization and are limited to specific tasks such as image recovery. To address these issues, we propose **Gated-Mechanism Mixture-of-Experts (GM-MoE)**, the first framework to introduce a mixture-of-experts network for low-light image enhancement. GM-MoE comprises a dynamic gated weight conditioning network and three sub-expert networks, each specializing in a distinct enhancement task. A self-designed gating mechanism dynamically adjusts the weights of the sub-expert networks for different data domains. Additionally, we integrate local and global feature fusion within the sub-expert networks to capture multi-scale features and enhance image quality. Experimental results demonstrate that GM-MoE achieves superior generalization compared with 25 existing approaches, reaching state-of-the-art performance in PSNR on 5 benchmarks and in SSIM on 4 benchmarks, respectively.
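The gating-driven expert combination described above can be sketched in a few lines. This is a minimal, illustrative sketch in plain Python, not the paper's implementation: the three expert functions and the gating scores below are hypothetical placeholders standing in for the learned sub-expert networks and the gating module's output logits.

```python
import math

def softmax(scores):
    # Numerically stable softmax over the gating logits
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def gated_moe(x, experts, gate_scores):
    """Blend expert outputs with softmax gating weights.

    `experts` is a list of callables (stand-ins for the three
    sub-expert networks); `gate_scores` are the raw logits a learned
    gating module would produce for this input's data domain.
    """
    weights = softmax(gate_scores)
    outputs = [expert(x) for expert in experts]
    # Weighted sum: the gate adaptively allocates weight per expert
    return sum(w * o for w, o in zip(weights, outputs))

# Toy experts on a single normalized pixel intensity (illustrative only)
experts = [
    lambda x: min(1.0, x + 0.2),   # exposure lift
    lambda x: x ** 0.5,            # gamma-style stretch
    lambda x: x,                   # pass-through
]

# A dark pixel, with the gate favoring the first expert
enhanced = gated_moe(0.04, experts, gate_scores=[2.0, 1.0, 0.1])
```

Because the softmax weights sum to one, the blended output stays in the experts' output range, and the whole combination remains differentiable, which is what allows the gate and experts to be trained end to end.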