🤖 AI Summary
Blind users face significant accessibility barriers in VR because of difficulties perceiving the spatial information of virtual objects, such as their direction, distance, and motion.
Method: This study presents the first systematic comparison, conducted with blind participants, of two tactile feedback modalities applied to the back of the hand (dorsal-hand vibration versus skin stretch) for conveying spatial information. The authors developed a custom dorsal-hand haptic device, a VR spatial rendering engine, and a dual-modality actuation control system, and evaluated performance with 10 blind participants using standardized experimental protocols.
Contribution/Results: Skin-stretch feedback significantly outperformed vibration: spatial position identification accuracy improved by 37%, and motion trajectory discrimination accuracy increased by 42%. Based on these findings, the authors propose evidence-based design guidelines for skin-stretch haptics tailored to accessible VR. This work establishes a paradigm for high-fidelity spatial haptic interaction and provides an empirical foundation for inclusive VR interface design.
📝 Abstract
Perceiving the spatial information of a virtual object (e.g., direction, distance) is critical yet challenging for blind users seeking an immersive virtual reality (VR) experience. To facilitate VR accessibility for blind users, in this paper, we investigate the effectiveness of two types of haptic cues - vibrotactile and skin-stretch cues - in conveying the spatial information of a virtual object when applied to the dorsal side of a blind user's hand. We conducted a user study with 10 blind users to investigate how they perceive static and moving objects in VR with a custom-made haptic apparatus. Our results reveal that blind users can more accurately understand an object's location and movement when receiving skin-stretch cues than when receiving vibrotactile cues. We discuss the pros and cons of both types of haptic cues and conclude with design recommendations for future haptic solutions for VR accessibility.