One-Shot Badminton Shuttle Detection for Mobile Robots

📅 2026-03-04
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the lack of efficient badminton shuttlecock detection methods for mobile robots operating under dynamic egocentric viewpoints by proposing the first single-stage detection framework tailored to this scenario. The authors construct the first large-scale, multi-scenario badminton dataset with difficulty-level annotations and employ a semi-automatic labeling pipeline to enhance data quality. Building upon an optimized YOLOv8 architecture and introducing custom evaluation metrics aligned with downstream tasks, the framework achieves end-to-end real-time detection. Experimental results demonstrate strong performance, with an F1-score of 0.86 in similar environments and a robust 0.70 in completely unseen settings, confirming the model's robustness and generalization capability under dynamic viewpoints.

๐Ÿ“ Abstract
This paper presents a robust one-shot badminton shuttlecock detection framework for non-stationary robots. To address the lack of egocentric shuttlecock detection datasets, we introduce a dataset of 20,510 semi-automatically annotated frames captured across 11 distinct backgrounds in diverse indoor and outdoor environments, and categorize each frame into one of three difficulty levels. For labeling, we present a novel semi-automatic annotation pipeline that enables efficient labeling from stationary camera footage. We propose a metric suited to our downstream use case and fine-tune a YOLOv8 network optimized for real-time shuttlecock detection, achieving an F1-score of 0.86 under our metric in test environments similar to training, and 0.70 in entirely unseen environments. Our analysis reveals that detection performance depends critically on shuttlecock size and background texture complexity. Qualitative experiments confirm the detector's applicability to robots with moving cameras. Unlike prior work with stationary camera setups, our detector is specifically designed for the egocentric, dynamic viewpoints of mobile robots, providing a foundational building block for downstream tasks, including tracking, trajectory estimation, and system (re)-initialization.
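The abstract reports F1-scores under a custom detection metric that the summary does not spell out. As a rough illustration of how such a per-frame F1 could be computed for single-object detection, the sketch below treats a detection as a true positive when its predicted center falls within a distance threshold of the ground-truth center; the threshold and center-distance criterion are illustrative assumptions, not the paper's actual metric.

```python
import math

def detection_f1(detections, ground_truth, threshold=10.0):
    """Per-frame F1 for single-shuttlecock detection (illustrative sketch).

    detections / ground_truth: parallel lists of (x, y) centers in pixels,
    with None marking frames where nothing was detected / annotated.
    A detection within `threshold` pixels of the label counts as a hit.
    """
    tp = fp = fn = 0
    for det, gt in zip(detections, ground_truth):
        if det is None and gt is None:
            continue                  # true negative: no shuttlecock, no detection
        if det is None:
            fn += 1                   # missed shuttlecock
        elif gt is None:
            fp += 1                   # spurious detection
        elif math.dist(det, gt) <= threshold:
            tp += 1                   # detection close enough to the label
        else:
            fp += 1                   # detection in the wrong place counts as
            fn += 1                   # both a false alarm and a miss
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

For example, one accurate detection, one correctly empty frame, and one badly localized detection yield precision = recall = 0.5 and hence F1 = 0.5.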
Problem

Research questions and friction points this paper is trying to address.

one-shot detection
badminton shuttlecock
mobile robots
egocentric vision
dynamic viewpoint
Innovation

Methods, ideas, or system contributions that make the work stand out.

one-shot detection
egocentric vision
mobile robot perception
semi-automatic annotation
YOLOv8 optimization