AI Summary
This study addresses the challenges of coordinating physical objects with data visualizations in situated analytics by pioneering the integration of video see-through augmented reality (AR) and virtual reality (VR) within real-world physical environments. It systematically evaluates and enhances object-centric brushing and linking techniques through the introduction of more effective interactive highlighting methods. A comparative assessment of AR and VR reveals that AR generally outperforms VR in terms of task completion time and accuracy, although the effectiveness of each modality is moderated by specific contextual factors. The findings not only extend the applicability of situated brushing and linking but also underscore the critical influence of real-environment stimuli on analytical performance, offering novel insights for visualization design in mixed reality contexts.
Abstract
In traditional visual analysis, brushing and linking is commonly used to visually connect multiple views through highlighting techniques. However, brushing and linking has rarely been applied in situated analytics, which uses visualizations to analyze data in the context of physical referents. In situated analytics, data representations must be visually linked to real-world objects. Previous work assessed situated brushing and linking in a virtual reality (VR) simulation of a supermarket scenario. Here, we replicate and extend that approach by studying brushing and linking in an actual physical space with augmented reality (AR), while further improving the highlighting techniques. Using a video see-through display, we compare AR with VR. Results suggest that AR outperforms VR in task completion time and accuracy, but the effectiveness of the techniques varies by condition. These results provide a new framing of how real-world stimuli matter in situated analytics.