When Refreshable Tactile Displays Meet Conversational Agents: Investigating Accessible Data Presentation and Analysis with Touch and Speech

📅 2024-08-09
🏛️ IEEE Transactions on Visualization and Computer Graphics
📈 Citations: 1
Influential: 0
🤖 AI Summary
To address the challenge of supporting data analysis for people who are blind or have low vision (BLV), this study investigates combining refreshable tactile displays (RTDs) with voice-based conversational agents for interactive data exploration. In a Wizard-of-Oz study, 11 BLV participants interacted with tactile line charts, bar charts, and isarithmic maps, and analysis of their interactions identified nine distinct tactile–voice interaction patterns. The choice of modality depended on the type of task and on prior experience with tactile graphics, and participants strongly preferred the combination of RTD and speech over either modality alone. Participants with more tactile experience reported that tactile images enabled deeper engagement with the data and supported independent interpretation. These findings inform the design of mixed-modality interfaces for accessible data analysis tools tailored to BLV users.

📝 Abstract
Despite the recent surge of research efforts to make data visualizations accessible to people who are blind or have low vision (BLV), how to support BLV people's data analysis remains an important and challenging question. As refreshable tactile displays (RTDs) become cheaper and conversational agents continue to improve, their combination provides a promising approach to support BLV people's interactive data exploration and analysis. To understand how BLV people would use and react to a system combining an RTD with a conversational agent, we conducted a Wizard-of-Oz study with 11 BLV participants, where they interacted with line charts, bar charts, and isarithmic maps. Our analysis of participants' interactions led to the identification of nine distinct patterns. We also learned that the choice of modalities depended on the type of task and prior experience with tactile graphics, and that participants strongly preferred the combination of RTD and speech to a single modality. In addition, participants with more tactile experience described how tactile images facilitated a deeper engagement with the data and supported independent interpretation. Our findings will inform the design of interfaces for such interactive mixed-modality systems.
Problem

Research questions and friction points this paper is trying to address.

Data Accessibility
Visual Impairment
Analytics Support
Innovation

Methods, ideas, or system contributions that make the work stand out.

Haptic Displays
Multimodal Data Exploration
Assistive Technology for Visually Impaired