The Collection of a Human Robot Collaboration Dataset for Cooperative Assembly in Glovebox Environments

📅 2024-07-19
🏛️ arXiv.org
🤖 AI Summary
The scarcity of high-quality, real-world hand-glove image datasets hinders safety and robustness in human-robot collaborative assembly in industrial settings. Method: We introduce HAGS, the first benchmark dataset for glovebox environments, comprising over 12K real multi-view frames with pixel-level hand/glove segmentation masks and, uniquely, quantified annotation uncertainty. To assess out-of-distribution robustness, we propose a chroma-key augmentation strategy that synthesizes realistic adversarial samples. We further establish a real-time semantic segmentation benchmark to systematically evaluate state-of-the-art models. Results: Experiments reveal substantial performance degradation of existing methods under industrial conditions. HAGS fills a critical gap as the first real-world industrial hand dataset with uncertainty-aware annotations, released as an open-source benchmark for safe human-robot collaboration.
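The summary highlights quantified annotation uncertainty as a distinguishing feature of HAGS. The paper's exact uncertainty measure is not given on this page; one common way to quantify per-pixel annotation uncertainty from multiple annotators is the variance of the annotator votes, sketched below (the function name and the list-of-masks interface are illustrative assumptions):

```python
import numpy as np

def annotation_uncertainty(masks):
    """Per-pixel annotation uncertainty from several binary annotator masks.

    `masks` is a list of HxW boolean arrays, one per annotator, where True
    marks "hand/glove". Returns the mean label p (fraction of annotators
    labeling the pixel positive) and a disagreement score p*(1-p), which is
    0 where annotators agree and maximal (0.25) where they split 50/50.
    """
    stack = np.stack([m.astype(np.float32) for m in masks])
    p = stack.mean(axis=0)           # per-pixel fraction of positive votes
    return p, p * (1.0 - p)          # variance of a Bernoulli label
```

Pixels with high disagreement can then be down-weighted during training or flagged for review, which is one way uncertainty-aware annotations support safe deployment.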

📝 Abstract
Industry 4.0 introduced AI as a transformative solution for modernizing manufacturing processes. Its successor, Industry 5.0, envisions humans as collaborators and experts guiding these AI-driven manufacturing solutions. Developing these techniques necessitates algorithms capable of safe, real-time identification of human positions in a scene, particularly their hands, during collaborative assembly. Although substantial efforts have curated datasets for hand segmentation, most focus on residential or commercial domains. Existing datasets targeting industrial settings predominantly rely on synthetic data, which we demonstrate does not effectively transfer to real-world operations. Moreover, these datasets lack uncertainty estimations critical for safe collaboration. Addressing these gaps, we present HAGS: Hand and Glove Segmentation Dataset. This dataset provides challenging examples to build applications toward hand and glove segmentation in industrial human-robot collaboration scenarios as well as assess out-of-distribution images, constructed via green screen augmentations, to determine ML-classifier robustness. We study state-of-the-art, real-time segmentation models to evaluate existing methods. Our dataset and baselines are publicly available.
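The abstract describes constructing out-of-distribution test images via green-screen augmentations. A minimal chroma-key composite can be sketched as follows; the threshold value and function interface are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def chroma_key_composite(frame, background, green_thresh=1.3):
    """Replace green-screen pixels in `frame` with pixels from `background`.

    A pixel is treated as green screen when its green channel exceeds
    `green_thresh` times the larger of its red and blue channels. Both
    inputs are HxWx3 uint8 arrays of the same shape. Returns the composited
    image and the boolean background mask.
    """
    f = frame.astype(np.float32)
    r, g, b = f[..., 0], f[..., 1], f[..., 2]
    # Green-dominant pixels are assumed to be the screen, not the hand.
    mask = g > green_thresh * np.maximum(r, b) + 1e-6
    out = frame.copy()
    out[mask] = background[mask]
    return out, mask
```

Swapping in unusual or adversarial backgrounds this way yields test images whose foreground (the gloved hand) is real while the scene statistics are out of distribution, which is the kind of robustness probe the abstract describes.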
Problem

Research questions and friction points this paper is trying to address.

Hand and Glove Image Dataset
Machine Learning Model Training
Safe Human-Robot Collaboration
Innovation

Methods, ideas, or system contributions that make the work stand out.

HAGS Dataset
Hand and Glove Recognition
Human-Robot Collaboration Safety
Shivansh Sharma
Nuclear and Applied Robotics Group, Department of Mechanical Engineering
Mathew Huang
Nuclear and Applied Robotics Group, Department of Mechanical Engineering
Sanat Nair
Nuclear and Applied Robotics Group, Department of Mechanical Engineering
Alan Wen
Nuclear and Applied Robotics Group, Department of Mechanical Engineering
Christina Petlowany
Nuclear and Applied Robotics Group, Department of Mechanical Engineering
Juston Moore
Los Alamos National Laboratory
Adversarial Machine Learning; Anomaly Detection
Selma Wanna
Nuclear and Applied Robotics Group, Department of Mechanical Engineering; Advanced Research in Cyber Systems, Los Alamos National Laboratory
Mitch Pryor
Nuclear and Applied Robotics Group, Department of Mechanical Engineering