🤖 AI Summary
This study addresses a key limitation in current robot navigation evaluation: the predominant reliance on bird's-eye-view assessments, which fail to capture the authentic social experience of close-proximity pedestrian encounters and thereby introduce perceptual bias. Through an immersive virtual reality experiment, the authors systematically compare users' perceptions of the social acceptability and intrusiveness of identical robot trajectories across three viewpoints: bird's-eye, close-range first-person, and distant first-person. The work further investigates the role of head-motion cues, specifically nodding, in bridging perceptual gaps between these viewpoints. Results reveal that observation perspective significantly influences social perception: trajectories deemed friendly from a bird's-eye view can appear more intrusive when experienced up close. Importantly, incorporating nodding gestures enhances perceived sociability and mitigates discomfort during proximate interactions, offering a novel strategy for designing socially aware robot navigation behaviors.
📝 Abstract
Ensuring that robot navigation is safe and socially acceptable is crucial for comfortable human-robot interaction in shared environments. However, existing validation methods often rely on a bird's-eye (allocentric) perspective, which fails to capture the subjective first-person experience of pedestrians encountering robots in the real world. In this paper, we address the perceptual gap between allocentric validation and egocentric experience by investigating how different perspectives affect the perceived sociability and disturbance of robot trajectories. Our approach uses an immersive VR environment to evaluate identical robot trajectories across allocentric, egocentric-proximal, and egocentric-distal viewpoints in a user study. We perform this analysis for trajectories generated from two different navigation policies to determine whether the observed differences are specific to a single trajectory type or generalize across policies. We further examine whether augmenting a trajectory with a head-nod gesture can bridge the perceptual gap and improve human comfort. Our experiments suggest that trajectories rated as sociable from an allocentric view may be perceived as significantly more disturbing when experienced from a first-person perspective in close proximity. Our results also demonstrate that while passing distance affects perceived disturbance, communicative social signaling, such as a head-nod, can effectively enhance the perceived sociability of the robot's behavior.