Light My Way: Developing and Exploring a Multimodal Interface to Assist People With Visual Impairments to Exit Highly Automated Vehicles

📅 2025-01-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
Blind and visually impaired persons (BVIPs) face significant safety and spatial-awareness challenges when alighting from highly automated vehicles (HAVs) in unfamiliar environments. Method: Through participatory workshops with BVIPs, the authors identified core information needs and designed PathFinder, a multimodal alighting assistance system. It integrates visual, spatial-audio, and haptic feedback tailored to HAV alighting while preserving contextual adaptivity and user autonomy. Employing user-centered design, low-fidelity prototyping, and a three-factorial mixed-methods study design, including qualitative interviews and NASA-TLX workload assessment, they evaluated the system with 16 BVIP participants in urban and rural scenarios. Results: PathFinder significantly reduced cognitive workload and maintained high perceived safety in both settings; in the urban scenario, it notably improved perceived safety and spatial orientation over an auditory-only baseline. Qualitative feedback strongly endorsed its usability and effectiveness.

📝 Abstract
The introduction of Highly Automated Vehicles (HAVs) has the potential to increase the independence of blind and visually impaired people (BVIPs). However, ensuring safety and situation awareness when exiting these vehicles in unfamiliar environments remains challenging. To address this, we conducted an interactive workshop with N=5 BVIPs to identify their information needs when exiting an HAV and evaluated three previously developed low-fidelity prototypes. The insights from this workshop guided the development of PathFinder, a multimodal interface combining visual, auditory, and tactile modalities tailored to BVIPs' unique needs. In a three-factorial within-between-subjects study with N=16 BVIPs, we evaluated PathFinder against an auditory-only baseline in urban and rural scenarios. PathFinder significantly reduced mental demand and maintained high perceived safety in both scenarios, while the auditory baseline led to lower perceived safety in the urban scenario compared to the rural one. Qualitative feedback further supported PathFinder's effectiveness in providing spatial orientation during exiting.
Problem

Research questions and friction points this paper is trying to address.

Autonomous Vehicles
Blind and Visually Impaired Persons
Environmental Perception
Innovation

Methods, ideas, or system contributions that make the work stand out.

PathFinder
Multisensory Feedback
Autonomous Vehicle Assistance
Luca-Maxim Meinhardt
Research Associate, Institute of Media Informatics, Ulm University
Human-Computer Interaction, future mobility, infinite scrolling, urban air mobility
Lina Wilke
Institute of Media Informatics, Ulm University, Ulm, Germany
Maryam Elhaidary
Institute of Media Informatics, Ulm University, Ulm, Germany
Julia von Abel
Institute of Media Informatics, Ulm, Germany
Paul Fink
The University of Maine, Maine, US
Michael Rietzler
Ulm University
HCI
Mark Colley
University College London
Automated Driving, Augmented Reality, Driver-Vehicle Interaction, Accessibility, Virtual Reality
Enrico Rukzio
Ulm University
Human-Computer Interaction