Navigation beyond Wayfinding: Robots Collaborating with Visually Impaired Users for Environmental Interactions

📅 2026-03-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes a human-robot collaborative navigation framework that extends guide-robot capabilities beyond path planning and obstacle avoidance to physical interaction with the environment, such as pressing elevator buttons or pulling out chairs: tasks that are essential for visually impaired users yet previously unaddressed. The approach alternates between a "lead" mode and an "adaptation" mode, dynamically combining the robot's high-precision perception and localization with the user's manipulation intent to coordinate their motion in real time during interactive tasks. Experimental results show that the system significantly outperforms both a conventional white cane and a non-adaptive guidance baseline in safety, fluency, and interaction efficiency, with the largest gains in scenarios demanding high-precision localization of interaction targets.

📝 Abstract
Robotic guidance systems have shown promise in supporting blind and visually impaired (BVI) individuals with wayfinding and obstacle avoidance. However, most existing systems assume a clear path and do not support a critical aspect of navigation: environmental interactions that require manipulating objects to enable movement. These interactions are challenging for a human-robot pair because they demand (i) precise localization and manipulation of interaction targets (e.g., pressing elevator buttons) and (ii) dynamic coordination between the user's and robot's movements (e.g., pulling out a chair to sit). We present a collaborative human-robot approach that combines our robotic guide dog's precise sensing and localization capabilities with the user's ability to perform physical manipulation. The system alternates between two modes: lead mode, where the robot detects and guides the user to the target, and adaptation mode, where the robot adjusts its motion as the user interacts with the environment (e.g., opening a door). Evaluation results show that our system enables navigation that is safer, smoother, and more efficient than both a traditional white cane and a non-adaptive guiding system, with the performance gap widening as tasks demand higher precision in locating interaction targets. These findings highlight the promise of human-robot collaboration in advancing assistive technologies toward more generalizable and realistic navigation support.
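The abstract's two-mode alternation can be pictured as a small state machine: the robot leads until the user is within reach of the interaction target, then yields control and adapts until the manipulation completes. The sketch below is a hypothetical illustration of that switching logic only; the class, method names, and trigger signals (`target_reached`, `interaction_done`) are assumptions, not the paper's actual implementation.

```python
from enum import Enum, auto

class Mode(Enum):
    LEAD = auto()        # robot detects the target and guides the user toward it
    ADAPTATION = auto()  # robot adjusts its motion while the user manipulates the object

class DualModeController:
    """Minimal sketch of the lead/adaptation alternation described in the abstract."""

    def __init__(self) -> None:
        self.mode = Mode.LEAD

    def step(self, target_reached: bool, interaction_done: bool) -> Mode:
        # Hand off to the user once they can reach the target (e.g., a door handle);
        # resume leading once the interaction (e.g., the door is open) completes.
        if self.mode is Mode.LEAD and target_reached:
            self.mode = Mode.ADAPTATION
        elif self.mode is Mode.ADAPTATION and interaction_done:
            self.mode = Mode.LEAD
        return self.mode
```

In practice the trigger signals would come from the robot's perception and localization stack, which is exactly the capability the paper argues the robot should contribute while the user contributes the manipulation itself.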
Problem

Research questions and friction points this paper is trying to address.

environmental interactions
visually impaired navigation
human-robot collaboration
object manipulation
assistive robotics
Innovation

Methods, ideas, or system contributions that make the work stand out.

human-robot collaboration
environmental interaction
adaptive guidance
assistive robotics
precise localization