AI Summary
This work addresses the social disadvantages that blind and low-vision individuals face in mixed-visual-ability interactions due to the absence of visual cues. To overcome the limitations of conventional assistive technologies, which primarily support individual tasks, the authors propose CollabLens, a context-aware smart glasses system that facilitates equitable collaboration across vision-diverse groups by delivering real-time visual information and enabling context-sensitive interaction. Empirical findings demonstrate that CollabLens effectively extends the assistive network for visually impaired users and significantly enhances their sense of social inclusion. The study also uncovers cognitive uncertainties experienced by sighted partners as they adapt their supportive behaviors, thereby offering a new paradigm for inclusive human-computer interaction design.
Abstract
Meaningful social interaction is vital to well-being, yet Blind and Low Vision (BLV) individuals face persistent barriers when collaborating with sighted peers due to inaccessible visual cues. While most wearable assistive technologies emphasize individual tasks, smart glasses introduce opportunities for real-time, contextual support in social settings. To explore how smart glasses affect interpersonal dynamics and support inclusion in mixed-vision groups, we developed a smart glasses-based system, CollabLens, as a technology probe and deployed it in four workshop sessions. We found that smart glasses can meaningfully support inclusive collaboration by expanding BLV participants' assistive networks with more flexible, independent access to visual information. While sighted participants viewed smart glasses as a promising medium that fosters interpersonal connection, they expressed uncertainty about adapting their helping behaviors. We conclude by discussing and synthesizing challenges and opportunities for designing smart glasses that provide seamless interaction experiences and enhance reciprocal mixed-vision social inclusion.