📝 Abstract
Marine environments present significant challenges for perception and autonomy due to dynamic surfaces, limited visibility, and complex interactions between aerial, surface, and submerged sensing modalities. This paper introduces the Aerial Marine Perception Dataset (AMP2026), a multi-platform marine robotics dataset collected across multiple field deployments and designed to support research in two primary areas: multi-view tracking and marine environment mapping. The dataset includes synchronized data from aerial drones, boat-mounted cameras, and submerged robotic platforms, along with associated localization and telemetry information. The goal of this work is to provide a publicly available dataset that enables research in marine perception and multi-robot observation scenarios. This paper describes the data collection methodology, sensor configurations, dataset organization, and the research tasks the dataset is intended to support.