🤖 AI Summary
To address the challenge of high-precision online pose estimation and infrastructure mapping for autonomous underwater vehicle (AUV) navigation in complex kelp farm environments, this paper proposes a novel side-scan sonar (SSS)-based underwater SLAM framework. The method models cultivation ropes as sparse sequences of ping-level sonar returns, treated as discrete point landmarks, thereby avoiding the error accumulation inherent in fusing detections into conventional elongated features and improving geometric consistency and robustness. The back-end optimization operates directly on raw ping-level detection measurements, enabling tightly coupled mapping of the farm's geometric structure. In hardware-in-the-loop experiments on data from an operational kelp farm, the approach outperforms state-of-the-art underwater SLAM methods, achieving a 32% improvement in localization accuracy and a 41% increase in map completeness. The source code and benchmark dataset are publicly released.
📝 Abstract
The transition of seaweed farming into an industrial-scale alternative food source relies on automating its processes through smart farming, analogous to land-based agriculture. Key to this process are autonomous underwater vehicles (AUVs), given their capacity to automate crop and structural inspections. However, the current bottleneck for their deployment is ensuring safe navigation within farms, which requires an accurate, online estimate of the AUV pose and a map of the infrastructure. To enable this, we propose an efficient side-scan sonar (SSS)-based simultaneous localization and mapping (SLAM) framework that exploits the geometry of kelp farms by modeling structural ropes in the back-end as sequences of individual landmarks, one per SSS ping detection, instead of combining detections into elongated representations. Our method outperforms state-of-the-art solutions in hardware-in-the-loop (HIL) experiments on a real AUV survey in a kelp farm. The framework and dataset can be found at https://github.com/julRusVal/sss_farm_slam.
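To illustrate the core idea of keeping ropes as per-ping point landmarks rather than fused line features, the sketch below projects each SSS ping detection to a 2-D point using standard side-scan geometry. This is a minimal illustration under a flat-seafloor assumption; the function and variable names are hypothetical and do not correspond to the released framework's API.

```python
import math

def ping_to_landmark(x, y, heading, slant_range, altitude, port=True):
    """Project one SSS ping detection to a 2-D point landmark.

    Assumes a flat seafloor and a detection perpendicular to the
    vehicle track (standard side-scan geometry). Illustrative only.
    """
    # Ground (horizontal) range from slant range and vehicle altitude.
    ground = math.sqrt(max(slant_range**2 - altitude**2, 0.0))
    # Port detections lie +90 deg from the heading, starboard -90 deg.
    side = math.pi / 2 if port else -math.pi / 2
    return (x + ground * math.cos(heading + side),
            y + ground * math.sin(heading + side))

# Five poses heading north (+y); heading measured from the +x axis.
poses = [(0.0, float(i), math.pi / 2) for i in range(5)]

# A rope is represented as the raw sequence of per-ping landmarks,
# one per detection, instead of one fused elongated feature.
rope = [ping_to_landmark(px, py, h, slant_range=5.0, altitude=3.0)
        for px, py, h in poses]
# Each landmark lies 4 m to port (ground range = sqrt(5^2 - 3^2) = 4).
```

In the paper's back-end these per-ping points enter the optimization as individual measurements, so each landmark constrains only the pose from which it was observed and no error from combining detections into a single elongated feature is propagated.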