AI Summary
To address the challenge of degraded situational awareness in indoor emergency response due to occlusions, this paper proposes a through-wall augmented reality (AR) framework enabling human-robot collaboration. The framework fuses real-time LiDAR perception data from first responders with simultaneous localization and mapping (SLAM) outputs from ground robots. Leveraging cross-LiDAR relative pose estimation and dynamic map alignment, it achieves, for the first time, low-latency, geometrically consistent AR overlay of occluded persons and hazards from the responder's mobile viewpoint. Key technical components include LiDAR-inertial odometry, 3D human detection, point cloud registration, and multi-source data fusion. Evaluated in simulation, laboratory, and tactical field experiments, the system demonstrates robust pose alignment and stable AR rendering, significantly improving situational awareness accuracy and operational safety in complex, occlusion-prone environments.
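To illustrate the cross-LiDAR alignment step described above, the following is a minimal sketch (not the authors' implementation) of registering a responder-mounted LiDAR scan to the robot's global map with point-to-plane ICP via Open3D; the helper name, voxel size, and correspondence threshold are assumptions for illustration only.

```python
import numpy as np
import open3d as o3d


def align_responder_scan(responder_scan: o3d.geometry.PointCloud,
                         robot_map: o3d.geometry.PointCloud,
                         init_guess: np.ndarray = np.eye(4)) -> np.ndarray:
    """Estimate the responder LiDAR pose in the robot's map frame.

    Hypothetical helper; parameter values are assumptions, not the paper's.
    """
    # Downsample both clouds so registration stays fast enough for online use.
    src = responder_scan.voxel_down_sample(voxel_size=0.10)
    tgt = robot_map.voxel_down_sample(voxel_size=0.10)

    # Point-to-plane ICP requires normals on the target cloud.
    tgt.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.5, max_nn=30))

    # Refine an initial guess (e.g., from odometry) with ICP.
    result = o3d.pipelines.registration.registration_icp(
        src, tgt,
        max_correspondence_distance=0.5,
        init=init_guess,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPlane())

    return result.transformation  # 4x4 pose of the responder LiDAR in the map frame
```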
Abstract
In emergency response missions, first responders must navigate cluttered indoor environments where occlusions block direct line-of-sight, concealing both life-threatening hazards and victims in need of rescue. We present STARC, a see-through AR framework for human-robot collaboration that fuses mobile-robot mapping with responder-mounted LiDAR sensing. A ground robot running LiDAR-inertial odometry performs large-area exploration and 3D human detection, while helmet- or handheld-mounted LiDAR on the responder is registered to the robot's global map via relative pose estimation. This cross-LiDAR alignment enables consistent first-person projection of detected humans and their point clouds, rendered in AR with low latency, into the responder's view. By providing real-time visualization of hidden occupants and hazards, STARC enhances situational awareness and reduces operator risk. Experiments in simulation, lab setups, and tactical field trials confirm robust pose alignment, reliable detections, and stable overlays, underscoring the potential of our system for fire-fighting, disaster relief, and other safety-critical operations. Code and design will be open-sourced upon acceptance.
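As a hedged sketch of the overlay step, the snippet below projects a detection expressed in the robot's map frame into the responder's first-person view, assuming a 4x4 responder pose (e.g., from the alignment sketch above) and pinhole display intrinsics; the function name, frame conventions, and intrinsics are illustrative assumptions, not the paper's API.

```python
import numpy as np


def project_detection_to_view(p_map: np.ndarray,
                              T_map_from_responder: np.ndarray,
                              K: np.ndarray):
    """Project a 3D detection (map frame) to responder-view pixel coordinates.

    Hypothetical helper; frame conventions and intrinsics are assumptions.
    """
    # Express the detection in the responder (view) frame.
    T_responder_from_map = np.linalg.inv(T_map_from_responder)
    p_h = np.append(p_map, 1.0)          # homogeneous map-frame point
    p_view = T_responder_from_map @ p_h  # [x, y, z, 1] in the view frame

    if p_view[2] <= 0.0:                 # behind the viewer: nothing to draw
        return None

    # Pinhole projection to pixel coordinates for the AR overlay.
    uv = K @ (p_view[:3] / p_view[2])
    return uv[:2]


# Example: a detected person roughly 4 m ahead of the responder
# (camera convention assumed: x right, y down, z forward).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
print(project_detection_to_view(np.array([0.5, -0.2, 4.0]), np.eye(4), K))
```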