Multimodal Feedback for Handheld Tool Guidance: Combining Wrist-Based Haptics with Augmented Reality

📅 2026-01-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenges of reduced precision and user confidence in optical see-through augmented reality (AR) during handheld tool guidance, which are often caused by visual occlusion, illumination variations, and ambiguous interface cues. To overcome these limitations, the authors propose a multimodal guidance system that integrates AR visual feedback with wrist-worn haptic feedback, introducing, for the first time, directional and state-based vibrotactile cues tailored to surgical tool manipulation. The system incorporates a reference mapping mechanism informed by surgeon preferences, a custom wrist-mounted haptic device, and a user-centered vibration encoding strategy. Experimental results demonstrate that the multimodal approach significantly outperforms unimodal conditions, achieving a spatial accuracy of 5.8 mm and a system usability score of 88.1, with notable reductions in cognitive load and enhanced user confidence during task execution.

📝 Abstract
We investigate how vibrotactile wrist feedback can enhance spatial guidance for handheld tool movement in optical see-through augmented reality (AR). While AR overlays are widely used to support surgical tasks, visual occlusion, lighting conditions, and interface ambiguity can compromise precision and confidence. To address these challenges, we designed a multimodal system combining AR visuals with a custom wrist-worn haptic device delivering directional and state-based cues. A formative study with experienced surgeons and residents identified key tool maneuvers and preferences for reference mappings, guiding our cue design. In a cue identification experiment (N=21), participants accurately recognized five vibration patterns under visual load, with higher recognition for full-actuator states than spatial direction cues. In a guidance task (N=27), participants using both AR and haptics achieved significantly higher spatial precision (5.8 mm) and usability (SUS = 88.1) than those using either modality alone, despite modest increases in task time. Participants reported that haptic cues provided reassuring confirmation and reduced cognitive effort during alignment. Our results highlight the promise of integrating wrist-based haptics into AR systems for high-precision, visually complex tasks such as surgical guidance. We discuss design implications for multimodal interfaces supporting confident, efficient tool manipulation.
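The paper does not publish its cue-encoding logic, but the abstract's distinction between directional cues and full-actuator state cues can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the four-actuator wristband layout, the actuator names (`radial`, `ulnar`, `dorsal`, `palmar`), and the intensity scaling are hypothetical; only the 5.8 mm figure comes from the paper, reused here as an on-target tolerance.

```python
import math

# Hypothetical four-actuator wristband layout: unit vectors in a plane
# perpendicular to the tool axis. Names and geometry are assumptions,
# not taken from the paper.
ACTUATORS = {
    "radial": (1.0, 0.0),
    "ulnar": (-1.0, 0.0),
    "dorsal": (0.0, 1.0),
    "palmar": (0.0, -1.0),
}

ON_TARGET_TOLERANCE_MM = 5.8  # spatial accuracy reported in the paper


def encode_cue(offset_mm, max_offset_mm=30.0):
    """Map a 2D tool-tip offset from the target (mm) to a vibrotactile cue.

    Returns ("state", "all", 1.0) when the tip is within tolerance,
    modeling a full-actuator confirmation cue; otherwise returns a
    directional cue on the single actuator whose axis best matches the
    correction direction, with intensity scaled by remaining distance.
    """
    dx, dy = offset_mm
    dist = math.hypot(dx, dy)
    if dist <= ON_TARGET_TOLERANCE_MM:
        return ("state", "all", 1.0)
    # The user should move toward the target, i.e. opposite the offset.
    ux, uy = -dx / dist, -dy / dist
    name = max(ACTUATORS, key=lambda a: ux * ACTUATORS[a][0] + uy * ACTUATORS[a][1])
    intensity = min(dist / max_offset_mm, 1.0)
    return ("direction", name, round(intensity, 2))
```

One design point this sketch captures: state cues drive all actuators at once while directional cues drive a single one, which is consistent with the abstract's finding that full-actuator states were recognized more reliably than spatial direction cues.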
Problem

Research questions and friction points this paper is trying to address.

augmented reality
haptic feedback
spatial guidance
multimodal interface
surgical tool manipulation
Innovation

Methods, ideas, or system contributions that make the work stand out.

multimodal feedback
wrist-based haptics
augmented reality
surgical guidance
spatial precision
Johnny (Yue) Yang
Stanford University
Christoph Leuze
Stanford University
Brian Hargreaves
Professor of Radiology, Stanford University
Bruce Daniel
Stanford University
Fred M Baik
Stanford University