Impact of Target and Tool Visualization on Depth Perception and Usability in Optical See-Through AR

📅 2025-08-25
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
In optical see-through augmented reality (OST-AR) systems (e.g., HoloLens 2), poor depth perception of virtual targets and the absence of real-tool occlusion critically hinder arm's-length interaction tasks such as surgical navigation. To address this, we conducted two controlled user studies systematically evaluating the effects of target opacity (0%–100%) and tool visualization strategy (real tool visible, virtual proxy, no tracking) on depth estimation accuracy, localization error, subjective workload, and usability. Results show that rendering the real tool enhances depth matching accuracy, minimizes localization error, maximizes usability, and reduces cognitive load, whereas omitting tool tracking yields the worst performance. Notably, a moderate reduction in target opacity (i.e., partial, not full, transparency) significantly improves depth estimation. Crucially, this work provides the first empirical evidence that real-time, geometrically consistent occlusion cues from the physical tool matter more for depth perception than target transparency alone. These findings establish actionable rendering and tracking design guidelines for near-field OST-AR interaction.

๐Ÿ“ Abstract
Optical see-through augmented reality (OST-AR) systems like Microsoft HoloLens 2 hold promise for arm's distance guidance (e.g., surgery), but depth perception of the hologram and occlusion of real instruments remain challenging. We present an evaluation of how visualizing the target object with different transparencies and visualizing a tracked tool (virtual proxy vs. real tool vs. no tool tracking) affects depth perception and system usability. Ten participants performed two experiments on HoloLens 2. In Experiment 1, we compared high-transparency vs. low-transparency target rendering in a depth matching task at arm's length. In Experiment 2, participants performed a simulated surgical pinpoint task on a frontal bone target under six visualization conditions ($2 imes 3$: two target transparencies and three tool visualization modes: virtual tool hologram, real tool, or no tool tracking). We collected data on depth matching error, target localization error, system usability, task workload, and qualitative feedback. Results show that a more opaque target yields significantly lower depth estimation error than a highly transparent target at arm's distance. Moreover, showing the real tool (occluding the virtual target) led to the highest accuracy and usability with the lowest workload, while not tracking the tool yielded the worst performance and user ratings. However, making the target highly transparent, while allowing the real tool to remain visible, slightly impaired depth cues and did not improve usability. Our findings underscore that correct occlusion cues, rendering virtual content opaque and occluding it with real tools in real time, are critical for depth perception and precision in OST-AR. Designers of arm-distance AR systems should prioritize robust tool tracking and occlusion handling; if unavailable, cautiously use transparency to balance depth perception and tool visibility.
Problem

Research questions and friction points this paper is trying to address.

Evaluating target transparency impact on depth perception in AR
Assessing tool visualization methods for system usability
Investigating occlusion cues for precision in surgical AR
Innovation

Methods, ideas, or system contributions that make the work stand out.

Opaque target rendering reduces depth error
Real tool occlusion improves accuracy and usability
Robust tool tracking and occlusion handling critical
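The occlusion-handling guideline above amounts to a per-pixel depth test: wherever the tracked real tool is closer to the eye than the virtual target, the target must be masked out so the physical tool reads as being in front. The paper does not publish its rendering code; the following is a minimal NumPy sketch of that z-test, with hypothetical depth maps (in metres) standing in for the tool tracker's output and the hologram's depth buffer.

```python
import numpy as np

def occlusion_mask(tool_depth: np.ndarray, target_depth: np.ndarray) -> np.ndarray:
    """Return a boolean mask: True where the virtual target remains visible.

    tool_depth uses np.inf at pixels the tracked tool does not cover,
    so the target always wins the depth test there.
    """
    return target_depth < tool_depth

# Toy 2x3 example: tool tip at 0.4 m covers the right two columns;
# virtual target is a plane at 0.5 m (typical arm's-length distance).
tool = np.array([[np.inf, 0.4, 0.4],
                 [np.inf, 0.4, 0.4]])
target = np.full((2, 3), 0.5)

visible = occlusion_mask(tool, target)
print(visible)  # target masked out wherever the tool is nearer
```

In an actual OST-AR renderer the same test is done in the depth buffer, e.g. by rasterizing a tracked proxy mesh of the tool into depth only, so the hologram fragments behind it are discarded and the real tool shows through the display.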
Authors

Yue Yang, Stanford University
Xue Xie, SJTU
Xinkai Wang, Southeast University (Embodied AI, LLM reasoning)
Hui Zhang, HUST
Chiming Yu, HUST
Xiaoxian Xiong, SJTU
Lifeng Zhu, SEU
Yuanyi Zheng, SJTU
Jue Cen, SJTU
Bruce Daniel, Stanford University
Fred Baik, Assistant Professor of Otolaryngology - Head & Neck Surgery, Stanford University (head and neck cancer, microvascular reconstruction, fluorescence, augmented reality)