ARCAS: An Augmented Reality Collision Avoidance System with SLAM-Based Tracking for Enhancing VRU Safety

📅 2025-12-04
📈 Citations: 0 (influential: 0)
🤖 AI Summary
Vulnerable road users (VRUs) face high collision risks in mixed-traffic environments, yet existing safety systems lack direct, real-time support for them. Method: This work proposes the first LiDAR-driven, real-time augmented reality (AR) collision warning system specifically designed for VRUs. It integrates roadside 360° 3D LiDAR perception, head-mounted display (HMD)-based SLAM pose tracking, and automatic 3D calibration to render georeferenced dynamic 3D bounding boxes and directional arrows within the user’s AR field of view. A novel multi-HMD shared spatial anchor coordination mechanism ensures cross-device AR spatial consistency. Results: Evaluated across 180 real-world traffic interactions, the system increased time-to-collision for pedestrians by an average of 97% and improved third-party reaction margins by up to 4× compared to unassisted (naked-eye) conditions—providing the first empirical evidence of significant safety gains enabled by LiDAR-AR guidance for VRUs.
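The automatic 3D calibration described above amounts to aligning the roadside LiDAR's coordinate frame with each headset's SLAM world frame, so that every detection can be rendered world-locked in the AR view. A minimal sketch of that idea, assuming a simple yaw-plus-translation rigid transform; the function names and the one-parameter rotation are illustrative assumptions, not the paper's actual calibration procedure:

```python
import math

def make_transform(yaw_rad, tx, ty, tz):
    """Build a 4x4 rigid transform (yaw rotation about z, plus translation).

    In a setup like ARCAS, a calibration transform of this shape would map
    points from the roadside LiDAR frame into the headset's SLAM world frame.
    """
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [
        [c, -s, 0.0, tx],
        [s,  c, 0.0, ty],
        [0.0, 0.0, 1.0, tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def apply_transform(T, point):
    """Apply a 4x4 homogeneous transform to a 3D point, returning (x, y, z)."""
    x, y, z = point
    p = [x, y, z, 1.0]
    return [sum(T[r][k] * p[k] for k in range(4)) for r in range(3)]

# Example: a hazard detected 10 m ahead of the LiDAR, with the headset's
# world frame rotated 90 degrees and offset 2 m from the LiDAR origin.
T = make_transform(math.pi / 2, 2.0, 0.0, 0.0)
print(apply_transform(T, (10.0, 0.0, 0.0)))  # approximately [2.0, 10.0, 0.0]
```

Once a detection is expressed in the SLAM world frame, the same transform chain lets a 3D bounding box stay anchored to the moving hazard as the wearer's head moves.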

📝 Abstract
Vulnerable road users (VRUs) face high collision risks in mixed traffic, yet most existing safety systems prioritize driver or vehicle assistance over direct VRU support. This paper presents ARCAS, a real-time augmented reality collision avoidance system that provides personalized spatial alerts to VRUs via wearable AR headsets. By fusing roadside 360-degree 3D LiDAR with SLAM-based headset tracking and an automatic 3D calibration procedure, ARCAS accurately overlays world-locked 3D bounding boxes and directional arrows onto approaching hazards in the user's passthrough view. The system also enables multi-headset coordination through shared world anchoring. Evaluated in real-world pedestrian interactions with e-scooters and vehicles (180 trials), ARCAS nearly doubled pedestrians' time-to-collision and increased counterparts' reaction margins by up to 4x compared to unaided-eye conditions. Results validate the feasibility and effectiveness of LiDAR-driven AR guidance and highlight the potential of wearable AR as a promising next-generation safety tool for urban mobility.
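The headline metric, time-to-collision (TTC), can be illustrated with a toy constant-velocity calculation. This is a generic sketch of the standard TTC definition, not the paper's measurement procedure; the function name and the example scenario are illustrative assumptions:

```python
import math

def time_to_collision(rel_pos, rel_vel):
    """Constant-velocity time-to-collision along the line of sight, in seconds.

    rel_pos: (x, y) vector from the pedestrian to the hazard, in metres.
    rel_vel: (vx, vy) velocity of the hazard relative to the pedestrian, in m/s.
    Returns None when the hazard is not closing in on the pedestrian.
    """
    dist = math.hypot(rel_pos[0], rel_pos[1])
    # Closing speed is the negative of the range rate (positive when approaching).
    closing = -(rel_pos[0] * rel_vel[0] + rel_pos[1] * rel_vel[1]) / dist
    return dist / closing if closing > 0 else None

# An e-scooter 12 m away, approaching head-on at 6 m/s: TTC = 2 s.
print(time_to_collision((12.0, 0.0), (-6.0, 0.0)))  # 2.0
```

Under this definition, the reported result that ARCAS nearly doubled pedestrians' TTC means alerts led pedestrians to act while hazards were, on average, about twice as far away in time.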
Problem

Research questions and friction points this paper is trying to address.

Develops AR system for VRU safety with real-time alerts
Uses LiDAR and SLAM for accurate hazard visualization
Enhances collision avoidance in mixed traffic via AR guidance
Innovation

Methods, ideas, or system contributions that make the work stand out.

SLAM-based AR headset tracking for real-time hazard overlay
Fusion of roadside LiDAR with wearable AR for spatial alerts
Multi-headset coordination via shared world anchoring system
Ahmad Yehia
Department of Civil, Architectural, and Environmental Engineering, The University of Texas at Austin, Austin, TX 78712, USA
Jiseop Byeon
Department of Civil, Architectural, and Environmental Engineering, The University of Texas at Austin, Austin, TX 78712, USA
Tianyi Wang
Department of Civil, Architectural, and Environmental Engineering, The University of Texas at Austin, Austin, TX 78712, USA
Huihai Wang
School of Architecture, The University of Texas at Austin, Austin, TX 78712, USA
Yiming Xu
School of Architecture, The University of Texas at Austin, Austin, TX 78712, USA
Junfeng Jiao
Associate Professor, Urban Information Lab, Texas Smart City, NSF NRT AI, UT Austin
AI · Smart City · Urban Informatics
Christian Claudel
UT Austin
Wireless sensor networks · Transportation engineering