🤖 AI Summary
This work addresses the challenges of high-quality 3D reconstruction and novel view synthesis from underwater sonar images, which are hindered by complex noise patterns and the absence of elevation information. To this end, the authors propose a noise-aware Gaussian splatting framework tailored to the characteristics of sonar imaging. Their approach is the first to use a Gaussian Mixture Model (GMM) to explicitly model sonar-specific noise, and it incorporates a bidirectional splatting mechanism that jointly accounts for intensity accumulation and transmission effects. By doing so, the method keeps the Gaussians from overfitting to noise, achieving significantly improved reconstruction accuracy and rendering fidelity on both simulated and real-world large-scale maritime scenes while maintaining high rendering efficiency.
📝 Abstract
Underwater sonar imaging plays a crucial role in applications such as autonomous navigation in murky water, marine archaeology, and environmental monitoring. However, the unique characteristics of sonar images, including complex noise patterns and the lack of elevation information, pose significant challenges for 3D reconstruction and novel view synthesis. In this paper, we present NAS-GS, a novel Noise-Aware Sonar Gaussian Splatting framework specifically designed to address these challenges. Our approach introduces a Two-Ways Splatting technique that accurately models the two distinct propagation directions used for intensity accumulation and transmittance calculation in sonar imaging, significantly improving rendering speed without sacrificing quality. Moreover, we propose a Gaussian Mixture Model (GMM)-based noise model that captures complex sonar noise patterns, including side-lobes, speckle, and multi-path noise. This model enhances the realism of synthesized images while preventing the 3D Gaussians from overfitting to noise, thereby improving reconstruction accuracy. We demonstrate state-of-the-art performance on both simulated and real-world large-scale offshore sonar scenarios, achieving superior results in novel view synthesis and 3D reconstruction.
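The abstract describes the GMM noise model only at a high level. As a rough, self-contained illustration of the underlying idea (not the authors' implementation — their model is fit jointly with the Gaussian splats, and the component count and initialization below are assumptions), a one-dimensional Gaussian mixture can be fit to sonar intensity residuals with expectation-maximization, separating low-level speckle from bright side-lobe/multi-path outliers:

```python
import numpy as np

def fit_gmm_1d(x, k=2, iters=100):
    """Fit a 1-D Gaussian mixture to intensity residuals via EM.

    Returns (weights, means, variances). Components are initialized
    from quantiles of the data so the fit is deterministic.
    """
    n = x.shape[0]
    means = np.quantile(x, np.linspace(0.1, 0.9, k))
    variances = np.full(k, x.var() + 1e-6)
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: per-sample responsibilities, shape (n, k), in log space
        # for numerical stability.
        diff = x[:, None] - means[None, :]
        log_p = (-0.5 * diff**2 / variances
                 - 0.5 * np.log(2 * np.pi * variances)
                 + np.log(weights))
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture parameters from responsibilities.
        nk = resp.sum(axis=0)
        weights = nk / n
        means = (resp * x[:, None]).sum(axis=0) / nk
        variances = (resp * (x[:, None] - means)**2).sum(axis=0) / nk + 1e-6
    return weights, means, variances

# Synthetic sonar-like residuals (illustrative values, not from the paper):
# a dominant low-level speckle mode plus a sparse bright-outlier mode.
rng = np.random.default_rng(1)
x = np.concatenate([
    rng.normal(0.0, 0.05, 5000),   # background speckle
    rng.normal(0.6, 0.10, 1000),   # side-lobe / multi-path highlights
])
w, mu, var = fit_gmm_1d(x, k=2)
```

Once fit, the per-component responsibilities give a soft label for whether a pixel's residual is explained by noise rather than geometry, which is the mechanism that lets a noise-aware loss downweight such pixels instead of letting the splats absorb them.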