STARC: See-Through-Wall Augmented Reality Framework for Human-Robot Collaboration in Emergency Response

πŸ“… 2025-09-18
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
To address the challenge of degraded situational awareness in indoor emergency response due to occlusions, this paper proposes a through-wall augmented reality (AR) framework enabling human-robot collaboration. The framework fuses real-time LiDAR perception data from first responders with simultaneous localization and mapping (SLAM) outputs from ground robots. Leveraging cross-LiDAR relative pose estimation and dynamic map alignment, it achieves, for the first time, low-latency, geometrically consistent AR overlay of occluded persons and hazards from the responder’s mobile viewpoint. Key technical components include LiDAR-inertial odometry, 3D human detection, point cloud registration, and multi-source data fusion. Evaluated in simulation, laboratory, and tactical field experiments, the system demonstrates robust pose alignment and stable AR rendering, significantly improving situational awareness accuracy and operational safety in complex, occlusion-prone environments.

Technology Category

Application Category

πŸ“ Abstract
In emergency response missions, first responders must navigate cluttered indoor environments where occlusions block direct line-of-sight, concealing both life-threatening hazards and victims in need of rescue. We present STARC, a see-through AR framework for human-robot collaboration that fuses mobile-robot mapping with responder-mounted LiDAR sensing. A ground robot running LiDAR-inertial odometry performs large-area exploration and 3D human detection, while helmet- or handheld-mounted LiDAR on the responder is registered to the robot's global map via relative pose estimation. This cross-LiDAR alignment enables consistent first-person projection of detected humans and their point clouds - rendered in AR with low latency - into the responder's view. By providing real-time visualization of hidden occupants and hazards, STARC enhances situational awareness and reduces operator risk. Experiments in simulation, lab setups, and tactical field trials confirm robust pose alignment, reliable detections, and stable overlays, underscoring the potential of our system for fire-fighting, disaster relief, and other safety-critical operations. Code and design will be open-sourced upon acceptance.
Problem

Research questions and friction points this paper is trying to address.

Enhancing situational awareness in occluded emergency environments
Providing real-time visualization of hidden hazards and victims
Enabling human-robot collaboration through cross-LiDAR AR framework
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mobile-robot mapping fused with responder-mounted LiDAR
Cross-LiDAR alignment for consistent AR projection
Real-time visualization of hidden occupants and hazards
πŸ”Ž Similar Papers
No similar papers found.
S
Shenghai Yuan
School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798
W
Weixiang Guo
School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798
T
Tianxin Hu
School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798
Y
Yu Yang
School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798
Jinyu Chen
Jinyu Chen
The Hong Kong Polytechnic University
Edge/cloud computingVideo transmission.
R
Rui Qian
School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798
Zhongyuan Liu
Zhongyuan Liu
Tencent
AIGC Games
Lihua Xie
Lihua Xie
Professor of Electrical Engineering, Nanyang Technological University
Robust controlNetworked ControlMult-agent Systems