Early Detection of Human Handover Intentions in Human-Robot Collaboration: Comparing EEG, Gaze, and Hand Motion

📅 2025-02-17
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In human-robot collaboration (HRC) object handover tasks, existing motion-trajectory-based intent recognition methods suffer from trajectory ambiguity and overlap, leading to delayed or erroneous predictions. To overcome this limitation, this work departs from motion-dependent paradigms and systematically compares—within a unified experimental framework—the early intent recognition capability of three non-motor physiological signals: EEG, eye movements, and hand kinematics. Using multimodal synchronized acquisition, temporal feature extraction, and LSTM/SVM classification, combined with cross-subject transfer learning and early-stopping evaluation, we demonstrate that eye movement signals achieve superior performance: average detection 320 ms prior to movement onset with 94.2% classification accuracy—significantly outperforming EEG (86.7%) and hand-motion features (89.1%). This study establishes a novel, low-latency, and robust paradigm for HRC intent understanding, empirically validating eye tracking as a highly effective modality for anticipatory human intent inference.

Technology Category

Application Category

📝 Abstract
Human-robot collaboration (HRC) relies on accurate and timely recognition of human intentions to ensure seamless interactions. Among common HRC tasks, human-to-robot object handovers have been studied extensively for planning the robot's actions during object reception, assuming the human intention for object handover. However, distinguishing handover intentions from other actions has received limited attention. Most research on handovers has focused on visually detecting motion trajectories, which often results in delays or false detections when trajectories overlap. This paper investigates whether human intentions for object handovers are reflected in non-movement-based physiological signals. We conduct a multimodal analysis comparing three data modalities: electroencephalogram (EEG), gaze, and hand-motion signals. Our study aims to distinguish between handover-intended human motions and non-handover motions in an HRC setting, evaluating each modality's performance in predicting and classifying these actions before and after human movement initiation. We develop and evaluate human intention detectors based on these modalities, comparing their accuracy and timing in identifying handover intentions. To the best of our knowledge, this is the first study to systematically develop and test intention detectors across multiple modalities within the same experimental context of human-robot handovers. Our analysis reveals that handover intention can be detected from all three modalities. Nevertheless, gaze signals are the earliest as well as the most accurate to classify the motion as intended for handover or non-handover.
Problem

Research questions and friction points this paper is trying to address.

Detects human handover intentions in HRC
Compares EEG, gaze, and hand-motion signals
Develops multimodal human intention detectors
Innovation

Methods, ideas, or system contributions that make the work stand out.

EEG, gaze, hand-motion analysis
Multimodal human intention detection
Early handover intention classification
🔎 Similar Papers
No similar papers found.