See What I See: An Attention-Guiding eHMI Approach for Autonomous Vehicles

📅 2026-02-21
🤖 AI Summary
This study addresses a critical limitation in current external human-machine interfaces (eHMIs) for autonomous vehicles, which predominantly convey only the vehicle’s own state and may inadvertently lead pedestrians to overlook surrounding environmental hazards. To mitigate this risk, the authors propose a novel projection-based attention-guiding eHMI (AGeHMI) that integrates an attention-guidance mechanism into eHMI design for the first time. By employing directional visual cues combined with risk-tiered color coding, AGeHMI actively directs pedestrians’ attention toward potential threats. Through a virtual reality user study leveraging projection-based visualization techniques, the research demonstrates that AGeHMI significantly improves the distribution of pedestrians’ visual attention, effectively reduces collision risk with surrounding vehicles, and simultaneously enhances subjective confidence while lowering cognitive load.

📝 Abstract
As autonomous vehicles are gradually being deployed in the real world, external Human-Machine Interfaces (eHMIs) are expected to serve as a critical solution for enhancing vehicle-pedestrian communication. However, existing eHMI designs typically focus solely on the ego vehicle's status, which can inadvertently capture pedestrians' attention or encourage misguided reliance on the AV's signals, leading them to neglect scanning for other surrounding hazards. To address this, we propose the Attention-Guiding eHMI (AGeHMI), a projection-based visualization that employs directional cues and risk-based color coding to actively guide pedestrians' attention toward potential environmental dangers. Evaluation through a virtual reality user study (N = 20) suggests that AGeHMI effectively influences participants' visual attention distribution and significantly reduces potential collision risks with surrounding vehicles, while simultaneously improving subjective confidence and reducing cognitive workload.
Problem

Research questions and friction points this paper is trying to address.

autonomous vehicles
external Human-Machine Interface
pedestrian safety
attention guidance
hazard awareness
Innovation

Methods, ideas, or system contributions that make the work stand out.

Attention-Guiding eHMI
autonomous vehicles
external Human-Machine Interface
visual attention guidance
risk-based color coding
Jialong Li
Waseda University, Japan
self-adaptive systems · requirement engineering · human-in-the-loop
Zhenyu Mao
City University of Hong Kong, China
Zhiyao Wang
The University of Osaka, Japan
Yijun Lu
Waseda University, Japan
Shogo Morita
Institute of Science Tokyo, Japan
Nianyu Li
ZGC Laboratory, China
Kenji Tei
Institute of Science Tokyo, Japan
software architecture · requirement engineering · self-adaptive systems · formal verification