Gaze Estimation for Human-Robot Interaction: Analysis Using the NICO Platform

📅 2025-09-28
🤖 AI Summary
This work addresses the practical limitations of gaze estimation in shared human–robot workspaces. We systematically evaluate existing methods by collecting multimodal real-world interaction data on the NICO robotic platform and constructing the first fine-grained, HRI-specific annotated dataset. Departing from conventional angular error metrics, we propose spatial distance error, measured in centimeters, as the primary evaluation criterion, which better reflects task-level interaction requirements. An experimental comparison of four state-of-the-art deep learning models reveals that, although angular accuracy approaches that reported on general-purpose benchmarks (e.g., MPIIGaze), the median spatial distance error remains as high as 16.48 cm, severely undermining interaction reliability. Crucially, this study is the first to quantitatively characterize performance degradation under dynamic, close-range, and non-orthogonal HRI conditions. We further identify key failure modes and propose robust ensemble strategies, including multi-source fusion and optimized coordinate mapping, to enhance real-world deployment viability.
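The spatial distance error proposed above can be sketched as follows. This is a hypothetical illustration, not the paper's code: it assumes gaze predictions have already been mapped to fixation points on the workspace surface (e.g., by intersecting the estimated gaze ray with the table plane) and expressed in the robot's workspace frame in centimeters; the function and variable names are invented for the example.

```python
import numpy as np

def spatial_distance_errors(pred_points_cm, gt_points_cm):
    """Per-sample Euclidean distance (cm) between predicted and
    ground-truth gaze fixation points on the workspace surface."""
    pred = np.asarray(pred_points_cm, dtype=float)
    gt = np.asarray(gt_points_cm, dtype=float)
    return np.linalg.norm(pred - gt, axis=1)

# Toy example: three fixations on a table plane (x, y in cm).
pred = [[10.0, 5.0], [30.0, 20.0], [50.0, 40.0]]
gt = [[12.0, 5.0], [30.0, 25.0], [45.0, 40.0]]
errs = spatial_distance_errors(pred, gt)          # per-sample errors in cm
median_err = float(np.median(errs))               # summary statistic, as in the paper
```

Using the median rather than the mean makes the summary statistic robust to the occasional gross failure cases the paper identifies.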

📝 Abstract
This paper evaluates current gaze estimation methods in an HRI context, specifically a shared workspace scenario. We introduce a new annotated dataset collected with the NICO robotic platform and evaluate four state-of-the-art gaze estimation models on it. The evaluation shows that the angular errors are close to those reported on general-purpose benchmarks. However, when expressed as distance in the shared workspace, the best median error is 16.48 cm, quantifying the practical limitations of current methods. We conclude by discussing these limitations and offering recommendations on how to best integrate gaze estimation as a modality in HRI systems.
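The gap between benchmark-level angular error and workspace distance error can be made concrete with a back-of-envelope calculation, which is an illustration of the general geometry rather than an analysis from the paper. Assuming the gaze ray hits a surface perpendicular to the line of sight at a given range, the induced distance error is roughly the range times the tangent of the angular error; oblique (non-orthogonal) surfaces and close-range viewing, the conditions the paper highlights, only amplify this.

```python
import math

def angular_to_distance_error_cm(angular_error_deg, range_cm):
    """Distance error (cm) induced by an angular gaze error at a given range,
    for a surface perpendicular to the line of sight (simplifying assumption)."""
    return range_cm * math.tan(math.radians(angular_error_deg))

# A 5-degree angular error at 60 cm, typical close-range HRI distances,
# already yields a displacement of several centimeters on the surface.
err_cm = angular_to_distance_error_cm(5.0, 60.0)
```

Even this optimistic perpendicular-surface case shows why angular metrics alone can overstate practical accuracy in a shared workspace.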
Problem

Research questions and friction points this paper is trying to address.

Evaluating gaze estimation methods in human-robot shared workspace
Introducing annotated dataset collected with NICO robotic platform
Quantifying practical limitations through workspace distance errors
Innovation

Methods, ideas, or system contributions that make the work stand out.

Evaluated four state-of-the-art gaze estimation models
Introduced new annotated dataset using NICO platform
Quantified practical limitations with 16.48 cm median error
Matej Palider
Faculty of Mathematics, Physics and Informatics, Comenius University, Bratislava
Omar Eldardeer
CONTACT UNIT, Italian Institute of Technology, Genoa
Viktor Kocur
Assistant Professor, Comenius University
computer vision · 3D vision · deep learning