From Woofs to Words: Towards Intelligent Robotic Guide Dogs with Verbal Communication

📅 2026-03-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the need for enhanced semantic understanding and spatial awareness in robotic guide dogs operating in dynamic environments for visually impaired users. It proposes a novel dialogue system, a first in this domain, that integrates a large language model (LLM) to unify natural language generation with real-time environmental perception and navigation planning. The system translates scene context and path-planning decisions into interpretable natural language, enabling collaborative human–robot decision-making. Through a human study and simulation experiments, the paper evaluates different verbalization strategies and demonstrates improvements in both interaction efficiency and navigation accuracy.

📝 Abstract
Assistive robotics is an important subarea of robotics that focuses on the well-being of people with disabilities. A robotic guide dog is an assistive quadruped robot that helps visually impaired people in obstacle avoidance and navigation. Enabling language capabilities for robotic guide dogs goes beyond naively adding an existing dialog system onto a mobile robot. The novel challenges include grounding language in the dynamically changing environment and improving spatial awareness for the human handler. To address those challenges, we develop a novel dialog system for robotic guide dogs that uses LLMs to verbalize both navigational plans and scenes. The goal is to enable verbal communication for collaborative decision-making within the handler-robot team. In experiments, we conducted a human study to evaluate different verbalization strategies and a simulation study to assess the efficiency and accuracy in navigation tasks.
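The abstract describes a system that verbalizes both navigational plans and scene context for the handler. A minimal, hypothetical sketch of that idea is below: it renders a symbolic plan into plain language and assembles a prompt an LLM could polish into handler-facing speech. All names, action vocabulary, and prompt wording here are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of plan/scene verbalization (not the paper's code).
# A navigation plan is a list of (action, argument) steps; the scene is a
# short list of detected-object phrases. A template fallback lets the
# sketch run without calling any model.

def verbalize_plan(steps):
    """Turn (action, argument) tuples into one plain-language sentence."""
    phrases = {
        "forward": "walk forward {} meters",
        "turn_left": "turn left at the {}",
        "turn_right": "turn right at the {}",
        "stop": "stop before the {}",
    }
    parts = [phrases[action].format(arg) for action, arg in steps]
    return "I will " + ", then ".join(parts) + "."

def build_llm_prompt(steps, scene_objects):
    """Assemble a prompt asking an LLM to verbalize the plan and scene
    for a visually impaired handler (wording is illustrative only)."""
    return (
        "You are a robotic guide dog speaking to your handler.\n"
        f"Scene: {', '.join(scene_objects)}.\n"
        f"Plan: {verbalize_plan(steps)}\n"
        "Describe the scene and plan in one short, reassuring sentence."
    )

if __name__ == "__main__":
    plan = [("forward", 5), ("turn_left", "crosswalk"), ("stop", "curb")]
    print(verbalize_plan(plan))
    print(build_llm_prompt(plan, ["curb", "crosswalk", "pedestrians"]))
```

The template path alone already supports the paper's notion of comparing verbalization strategies: different `phrases` tables (terse vs. descriptive) yield different handler-facing utterances for the same plan.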
Problem

Research questions and friction points this paper is trying to address.

robotic guide dogs
verbal communication
language grounding
spatial awareness
assistive robotics
Innovation

Methods, ideas, or system contributions that make the work stand out.

robotic guide dog
verbal communication
language grounding
large language models
assistive robotics
Yohei Hayamizu
SUNY Binghamton
Artificial Intelligence, Robotics, Reinforcement Learning, Dialog Navigation Systems
David DeFazio
Graduate Student, Binghamton University
Robotics, Reinforcement Learning, Graph Neural Networks
Hrudayangam Mehta
The State University of New York at Binghamton
Zainab Altaweel
The State University of New York at Binghamton
Jacqueline Choe
The State University of New York at Binghamton
Chao Lin
The State University of New York at Binghamton
Jake Juettner
The State University of New York at Binghamton
Furui Xiao
The State University of New York at Binghamton
Jeremy Blackburn
Associate Professor, Binghamton University School of Computing
scoreboardology, cybersafety, computational social science
Shiqi Zhang
Associate Professor of Computer Science, SUNY Binghamton
Robotics, Artificial Intelligence