Sight Guide: A Wearable Assistive Perception and Navigation System for the Vision Assistance Race in the Cybathlon 2024

📅 2025-06-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Visually impaired individuals face significant challenges in spatial awareness and semantic understanding during navigation in unknown environments. Method: This study proposes a lightweight, end-to-end wearable navigation system integrating RGB-D multi-camera input, embedded SLAM-based mapping, YOLOv8 object detection, PaddleOCR text recognition, and adaptive vibrotactile encoding feedback. Contribution/Results: The system achieves, for the first time in the Cybathlon 2024 Vision Assistance Race, synergistic integration of OCR-based text reading, touchscreen interaction, and semantics-aware obstacle avoidance. Leveraging voice commands and closed-loop multi-point vibrotactile guidance, it attains a 95.7% task success rate in highly dynamic real-world scenarios, completing all competition challenges. These results validate its robustness, real-time performance, and practical deployability, establishing a scalable technical paradigm for real-world accessible navigation.

Technology Category

Application Category

📝 Abstract
Visually impaired individuals face significant challenges navigating and interacting with unknown situations, particularly in tasks requiring spatial awareness and semantic scene understanding. To accelerate the development and evaluate the state of technologies that enable visually impaired people to solve these tasks, the Vision Assistance Race (VIS) at the Cybathlon 2024 competition was organized. In this work, we present Sight Guide, a wearable assistive system designed for the VIS. The system processes data from multiple RGB and depth cameras on an embedded computer that guides the user through complex, real-world-inspired tasks using vibration signals and audio commands. Our software architecture integrates classical robotics algorithms with learning-based approaches to enable capabilities such as obstacle avoidance, object detection, optical character recognition, and touchscreen interaction. In a testing environment, Sight Guide achieved a 95.7% task success rate, and further demonstrated its effectiveness during the Cybathlon competition. This work provides detailed insights into the system design, evaluation results, and lessons learned, and outlines directions towards a broader real-world applicability.
Problem

Research questions and friction points this paper is trying to address.

Develop wearable system for visually impaired navigation
Integrate obstacle avoidance and object detection technologies
Enhance real-world task performance for vision assistance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wearable system with RGB and depth cameras
Integrates robotics algorithms and learning-based approaches
Uses vibration and audio for user guidance
🔎 Similar Papers
No similar papers found.
Patrick Pfreundschuh
Patrick Pfreundschuh
PhD Candidate, Autonomous Systems Lab, ETH Zurich
RoboticsLiDAR OdometryPerceptionMapping
Giovanni Cioffi
Giovanni Cioffi
University of Zurich
RoboticsComputer Vision
C
C. V. Einem
Autonomous Systems Lab, ETH Zürich, CH.
A
Alexander Wyss
School of Engineering, Zurich University of Applied Sciences, CH.
H
H. W. V. D. Venn
School of Engineering, Zurich University of Applied Sciences, CH.
C
César Cadena
Robotics Systems Lab, ETH Zürich, CH.
D
D. Scaramuzza
Robotics and Perception Group, University of Zurich, CH.
R
R. Siegwart
Autonomous Systems Lab, ETH Zürich, CH.
Alireza Darvishy
Alireza Darvishy
Professor of Computer Science (ICT Accessibility), Zurich University of Applied Sciences
ICT AccessibilityHuman-Computer InteractionActive Assisted LivingAI-based assistive technologiesPDF Accessibility