SemanticSugarBeets: A Multi-Task Framework and Dataset for Inspecting Harvest and Storage Characteristics of Sugar Beets

📅 2025-04-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
While stored prior to processing, sugar beets lose sugar through microbial activity in adherent soil, rot, damage, and excess vegetation. To support automated visual quality inspection, this paper introduces a high-quality, densely annotated dataset of post-harvest and post-storage sugar beets in monocular RGB images. The authors propose a two-stage pipeline that first localizes individual beets and then performs fine-grained semantic segmentation, distinguishing damage, rot, soil adhesion, and excess vegetation, enabling downstream quality and mass estimation. Extensive ablations evaluate image sizes, model architectures, and encoders, and analyze the influence of environmental conditions on both tasks. The best models achieve 98.8 mAP₅₀₋₉₅ for beet detection and 64.0 mIoU for segmentation.

📝 Abstract
While sugar beets are stored prior to processing, they lose sugar due to factors such as microorganisms present in adherent soil and excess vegetation. Their automated visual inspection promises to aid in quality assurance and thereby increase efficiency throughout the processing chain of sugar production. In this work, we present a novel high-quality annotated dataset and two-stage method for the detection, semantic segmentation and mass estimation of post-harvest and post-storage sugar beets in monocular RGB images. We conduct extensive ablation experiments for the detection of sugar beets and their fine-grained semantic segmentation regarding damages, rot, soil adhesion and excess vegetation. For these tasks, we evaluate multiple image sizes, model architectures and encoders, as well as the influence of environmental conditions. Our experiments show an mAP50-95 of 98.8 for sugar-beet detection and an mIoU of 64.0 for the best-performing segmentation model.
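The two-stage idea from the abstract can be sketched in a few lines. The detector and segmenter below are stand-in stubs (the paper does not publish this interface, and the class list, function names, and box format are assumptions for illustration); the sketch only shows how detection crops feed the fine-grained segmentation stage and how per-class pixel fractions could drive quality estimation:

```python
import numpy as np

# Hypothetical class indices for the fine-grained segmentation task.
CLASSES = {0: "beet", 1: "damage", 2: "rot", 3: "soil", 4: "vegetation"}

def detect_beets(image):
    """Stage 1: return bounding boxes (x, y, w, h), one per beet.
    Stub standing in for a trained detector."""
    h, w = image.shape[:2]
    return [(0, 0, w // 2, h), (w // 2, 0, w - w // 2, h)]

def segment_crop(crop):
    """Stage 2: return a per-pixel class mask for one beet crop.
    Stub standing in for a trained semantic-segmentation model."""
    mask = np.zeros(crop.shape[:2], dtype=np.int64)  # everything "beet"
    mask[: crop.shape[0] // 4] = 3                   # pretend the top quarter is soil
    return mask

def quality_report(image):
    """Run both stages and report per-class pixel fractions for each beet."""
    reports = []
    for (x, y, w, h) in detect_beets(image):
        mask = segment_crop(image[y:y + h, x:x + w])
        fractions = {CLASSES[c]: float((mask == c).sum()) / mask.size
                     for c in CLASSES}
        reports.append(fractions)
    return reports

image = np.zeros((64, 64, 3), dtype=np.uint8)
print(quality_report(image))  # two beets, each 75% "beet" / 25% "soil"
```

In a real deployment the fractions (e.g. the soil or rot share of a beet's visible surface) would feed the paper's quality and mass estimates; here they only illustrate the data flow between the two stages.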
Problem

Research questions and friction points this paper is trying to address.

Automated visual inspection of sugar beet quality
Detection and segmentation of damages and rot
Mass estimation of post-harvest sugar beets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Two-stage method for detection and segmentation
High-quality annotated dataset for sugar beets
Extensive ablation experiments on model architectures
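The segmentation metric reported above, mIoU, averages per-class intersection-over-union over the classes present. A minimal reference implementation, assuming integer class masks (this is the standard metric definition, not code from the paper):

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union across classes.
    Classes absent from both prediction and target are skipped."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))

pred   = np.array([[0, 0, 1], [1, 2, 2]])
target = np.array([[0, 1, 1], [1, 2, 2]])
print(mean_iou(pred, target, num_classes=3))  # mean of 1/2, 2/3 and 1
```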