Gazing at Failure: Investigating Human Gaze in Response to Robot Failure in Collaborative Tasks

📅 2025-02-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates how two classes of robot error, execution errors (e.g., motion inaccuracies) and decision errors (e.g., incorrect task planning), elicit distinctive human gaze dynamics during human–robot collaboration, enabling online failure detection and perception modelling. Using Tobii Pro Fusion eye tracking, a UR5e robotic arm integrated with a Jackal mobile base, and a Tangram-based collaborative task, the authors applied Area-of-Interest (AOI) segmentation and transfer entropy analysis to quantify gaze transitions. Results reveal that decision errors, particularly those occurring late in task execution, significantly reduce the transfer entropy of gaze transitions, whereas execution errors induce more frequent saccades and increased visual fixation on the robot. Machine learning classifiers achieve high accuracy in distinguishing error types (AUC > 0.92) and in predicting user-perceived declines in robot trustworthiness. This work establishes gaze behaviour as a novel, real-time, deployable signal for failure identification and adaptive recovery strategy selection.
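The entropy measure referenced above can be illustrated with a minimal sketch. The paper's actual pipeline uses transfer entropy over AOI gaze transitions; the simpler function below (hypothetical names, pure standard library) computes the Shannon entropy of the distribution of transitions between consecutive, distinct AOIs, which captures the same intuition that more varied gaze shifting yields higher entropy:

```python
import math
from collections import Counter

def gaze_transition_entropy(aoi_sequence):
    """Shannon entropy (bits) of the distribution of transitions
    between consecutive, distinct Areas of Interest (AOIs)."""
    transitions = [
        (a, b) for a, b in zip(aoi_sequence, aoi_sequence[1:]) if a != b
    ]
    if not transitions:
        return 0.0
    total = len(transitions)
    counts = Counter(transitions)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical AOI labels for illustration; a sequence that cycles
# between only two AOIs would score lower than this more varied one.
fixations = ["robot", "puzzle", "robot", "puzzle", "instructions"]
print(round(gaze_transition_entropy(fixations), 3))  # → 1.5
```

Under this measure, the paper's finding reads naturally: decisional failures late in the task narrow the set of gaze transitions participants make, which lowers this entropy relative to normal collaboration.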

📝 Abstract
Robots are prone to making errors, which can negatively impact their credibility as teammates during collaborative tasks with human users. Detecting and recovering from these failures is crucial for maintaining an effective level of trust from users. However, robots may fail without being aware of it. One way to detect such failures could be by analysing humans' non-verbal behaviours and reactions to failures. This study investigates how human gaze dynamics can signal a robot's failure and examines how different types of failures affect people's perception of the robot. We conducted a user study with 27 participants collaborating with a robotic mobile manipulator to solve tangram puzzles. The robot was programmed to experience two types of failures -- executional and decisional -- occurring either at the beginning or end of the task, with or without acknowledgement of the failure. Our findings reveal that the type and timing of the robot's failure significantly affect participants' gaze behaviour and perception of the robot. Specifically, executional failures led to more gaze shifts and increased focus on the robot, while decisional failures resulted in lower entropy in gaze transitions among areas of interest, particularly when the failure occurred at the end of the task. These results highlight that gaze can serve as a reliable indicator of robot failures and their types, and could also be used to predict the appropriate recovery actions.
Problem

Research questions and friction points this paper is trying to address.

Can human gaze dynamics signal a robot's failures?
How do the type and timing of a failure affect users' perception of the robot?
Can gaze be used to predict appropriate recovery actions after a failure?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analyses human gaze dynamics during collaborative tasks
Detects robot failures from non-verbal human reactions
Uses gaze patterns to predict appropriate recovery actions
Ramtin Tabatabaei
PhD Student, University of Melbourne
Robot Failure, Human Robot Collaboration, Human Gaze
V. Kostakos
The University of Melbourne, Melbourne, Australia
W. Johal
The University of Melbourne, Melbourne, Australia