🤖 AI Summary
This study addresses the lack of inclusive design in external human-machine interfaces (eHMIs) for autonomous vehicles, which commonly overlook the communication needs of deaf and hard-of-hearing (DHH) individuals. It presents a systematic investigation of eHMI usability for DHH users, employing focus group interviews, virtual reality simulations, eye-tracking, and subjective evaluations to compare visual and auditory eHMI modalities. Findings reveal that both visual and auditory eHMIs enhanced DHH participants' trust, perceived safety, and perceived system usefulness, but only visual eHMIs reduced crossing decision time and gaze duration, indicating limited behavioral benefit from auditory signals for this group. Building on these insights, the study proposes five inclusive eHMI design principles tailored to DHH users, thereby addressing a critical gap in accessible intelligent transportation interaction research.
📝 Abstract
External Human-Machine Interfaces (eHMIs) have been proposed to facilitate communication between Automated Vehicles (AVs) and pedestrians. However, little attention has been given to Deaf and Hard-of-Hearing (DHH) people. We conducted a formative study through focus groups with 6 DHH people and 6 key stakeholders (including researchers, assistive technologists, and automotive interface designers) to compare proposed eHMIs and extract key design requirements. Subsequently, we investigated the effects of visual and auditory eHMIs in a virtual reality user study with 32 participants (16 DHH). Results from our scenarios suggest that (1) DHH participants spent more time looking at the AV; (2) both visual and auditory eHMIs enhanced trust, usefulness, and perceived safety; and (3) only visual eHMIs reduced the time to step into the road, the time spent looking at the AV, gaze time, and the percentage of time looking at active visual eHMI components. Lastly, we provide five practical implications for making eHMIs inclusive of DHH people.