🤖 AI Summary
This study addresses the challenges that deaf and hard of hearing students face in laboratory instruction: divided attention, elevated cognitive load, and safety risks stemming from communication barriers. To mitigate these issues, the authors introduce ARRAE (Augmented Reality Real-Time Access for Education), a system that uses augmented reality (AR) smart glasses to overlay real-time sign language interpretation or captions directly in the user's field of view, aligning access services with the experimental task space. The system is evaluated through eye-tracking data and participants' self-reported assessments. Results suggest that ARRAE can improve attention allocation and reduce perceived cognitive load, though preferences remain context-dependent. The study also identifies key design considerations, including display placement, visual fatigue, and compatibility with hearing-assistive devices, informing the design of inclusive laboratory education.
📝 Abstract
Deaf and hard of hearing (DHH) students often experience communication barriers in higher education, and these barriers are particularly acute in experiential learning environments such as laboratories. Traditional accessibility services, such as interpreting and captioning, often require DHH students to divide their attention among critical tasks, potential safety hazards, instructional materials, and access providers, creating trade-offs between safety and equitable communication. These demands can disrupt task engagement and increase cognitive load in settings that require sustained visual focus, highlighting the limitations of current approaches. To address these challenges, this study investigates Augmented Reality Real-Time Access for Education (ARRAE), an ecosystem based on augmented reality (AR) smart glasses, as a potential intervention for laboratory-based environments. By overlaying interpreters or captions directly into a student's field of view, AR enables the integration of accessibility into hands-on learning without compromising safety or comprehension. Through an empirical study with 12 DHH participants, we evaluate how AR-mediated access influences visual attention patterns and perceived cognitive load during hands-on tasks. The findings suggest that AR-mediated communication shows strong potential to improve attention management and communication accessibility in experiential learning environments, though participants emphasized that accessibility preferences are highly context-dependent. Participants also identified several design and ergonomic challenges, including display positioning, visual fatigue, and compatibility with hearing devices. Together, these results highlight both the promise of AR for supporting accessible participation in visually demanding environments and key design considerations for future systems.