SeagrassFinder: Deep Learning for Eelgrass Detection and Coverage Estimation in the Wild

📅 2024-12-20
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the time-consuming, subjective, and non-scalable nature of manual seagrass coverage assessment, this study proposes a deep learning–based automated method for *Zostera marina* (eelgrass) detection and coverage quantification. First, the authors construct a large-scale, high-quality underwater eelgrass image dataset comprising over 8,300 annotated images. Second, they evaluate several architectures for binary classification of seagrass presence, with a Vision Transformer achieving an AUROC above 0.95 on the final test set. Third, they introduce a novel video-level coverage estimation approach that aggregates frame-wise predictions, yielding preliminary estimates that align well with expert manual labels. The method enables efficient, large-scale processing of underwater video data, substantially improving the granularity, objectivity, and scalability of monitoring, and provides a technical foundation for dynamic assessment of marine ecosystems.
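The video-level coverage idea above (turning per-frame presence predictions into a single coverage score) can be sketched as follows. This is a minimal illustration, not the paper's method: the aggregation scheme here (a triangular temporal smoothing window followed by a plain mean) is an assumption, and `estimate_coverage` is a hypothetical name.

```python
# Hypothetical sketch: aggregate per-frame seagrass-presence probabilities
# into one video-level coverage estimate. The paper's actual aggregation
# is not reproduced here; a simple triangular temporal smoothing window
# is assumed purely for illustration.

def estimate_coverage(frame_probs, window=5):
    """Smooth frame-wise probabilities over time, then average them
    into a single coverage fraction in [0, 1]."""
    n = len(frame_probs)
    half = window // 2
    smoothed = []
    for i in range(n):
        acc, wsum = 0.0, 0.0
        for j in range(max(0, i - half), min(n, i + half + 1)):
            w = 1.0 - abs(i - j) / (half + 1)  # triangular weight
            acc += w * frame_probs[j]
            wsum += w
        smoothed.append(acc / wsum)
    return sum(smoothed) / n

# Example: a short clip where one frame's prediction dips (e.g. turbidity)
probs = [0.9, 0.8, 0.1, 0.85, 0.95]
print(round(estimate_coverage(probs), 3))
```

Smoothing before averaging damps single-frame prediction glitches, which matters when one noisy frame should not swing the coverage estimate for a whole transect segment.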

📝 Abstract
Seagrass meadows play a crucial role in marine ecosystems, providing benefits such as carbon sequestration, water quality improvement, and habitat provision. Monitoring the distribution and abundance of seagrass is essential for environmental impact assessments and conservation efforts. However, the current manual methods of analyzing underwater video data to assess seagrass coverage are time-consuming and subjective. This work explores the use of deep learning models to automate the process of seagrass detection and coverage estimation from underwater video data. We create a new dataset of over 8,300 annotated underwater images, and subsequently evaluate several deep learning architectures, including ResNet, InceptionNetV3, DenseNet, and Vision Transformer for the task of binary classification on the presence and absence of seagrass by transfer learning. The results demonstrate that deep learning models, particularly Vision Transformers, can achieve high performance in predicting eelgrass presence, with AUROC scores exceeding 0.95 on the final test dataset. The application of underwater image enhancement further improved the models' prediction capabilities. Furthermore, we introduce a novel approach for estimating seagrass coverage from video data, showing promising preliminary results that align with expert manual labels, and indicating potential for consistent and scalable monitoring. The proposed methodology allows for the efficient processing of large volumes of video data, enabling the acquisition of much more detailed information on seagrass distributions in comparison to current manual methods. This information is crucial for environmental impact assessments and monitoring programs, as seagrasses are important indicators of coastal ecosystem health. This project demonstrates the value that deep learning can bring to the field of marine ecology and environmental monitoring.
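The abstract's headline metric is AUROC above 0.95 for binary seagrass presence/absence. For readers unfamiliar with the metric, here is a from-scratch illustration of what that number means: the probability that a randomly chosen seagrass-present image receives a higher score than a randomly chosen seagrass-absent image. The function name and sample data are made up for illustration.

```python
# Illustrative only: AUROC computed directly from its rank interpretation.
# labels are 1 (seagrass present) / 0 (absent); scores are classifier outputs.

def auroc(labels, scores):
    """Fraction of (positive, negative) pairs where the positive is
    scored higher; ties count as half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y = [1, 1, 0, 1, 0, 0]
s = [0.92, 0.75, 0.40, 0.88, 0.55, 0.10]
print(auroc(y, s))  # perfect separation here -> 1.0
```

An AUROC of 0.95 thus means that for 95% of present/absent image pairs, the model ranks the seagrass image higher, regardless of any particular decision threshold.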
Problem

Research questions and friction points this paper is trying to address.

Automating seagrass detection from underwater videos using deep learning
Estimating seagrass coverage efficiently to replace manual methods
Enhancing marine ecosystem monitoring with scalable AI-driven solutions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Deep learning models automate seagrass detection
Vision Transformers achieve high AUROC scores
Novel video coverage estimation aligns with experts
Jannik Elsasser
DHI Group, Hørsholm, Denmark
Laura Weihl
IT University of Copenhagen, Copenhagen, Denmark
Veronika Cheplygina
IT University Copenhagen
meta-research, pattern recognition, machine learning, medical imaging, open science
Lisbeth Tangaa Nielsen
DHI Group, Hørsholm, Denmark