What Sensors See, What People Feel: Exploring Subjective Collaboration Perception in Mixed Reality

📅 2025-04-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In mixed reality (MR) collaboration, a significant mapping gap exists between objective sensor signals—such as gaze, speech, and spatial motion—and users’ subjective collaborative experience, which is governed by implicit states including presence, cognitive usability, and social awareness. To bridge this gap, we propose the first Sensor-to-Subjective (S2S) mapping framework. It integrates multimodal sensing (eye tracking, speech analysis, and pose estimation), task log analytics, and interpretable regression modeling to systematically characterize the relationship between behavioral metrics and subjective experience. Evaluated in a controlled study with 48 participants across 12 MR collaborative tasks, the framework demonstrates that shared gaze and physical proximity significantly predict subjective collaboration perception (p < 0.01). This work establishes the first objective, signal-driven, and interpretable method for inferring subjective collaborative experience in MR, advancing both evaluation methodology and design guidance for immersive collaborative systems.

Technology Category

Application Category

📝 Abstract
Mixed Reality (MR) enables rich, embodied collaboration, yet it's uncertain if sensor and system-logged behavioral signals capture how users experience that collaboration. This disconnect stems from a fundamental gap: behavioral signals are observable and continuous, while collaboration is interpreted subjectively, shaped by internal states like presence, cognitive availability, and social awareness. Our core insight is that sensor signals serve as observable manifestations of subjective experiences in MR collaboration, and they can be captured through sensor data such as shared gaze, speech, spatial movement, and other system-logged performance metrics. We propose the Sensor-to-Subjective (S2S) Mapping Framework, a conceptual model that links observable interaction patterns to users' subjective perceptions of collaboration and internal cognitive states through sensor-based indicators and task performance metrics. To validate this model, we conducted a study with 48 participants across 12 MR groups engaged in a collaborative image-sorting task. Our findings show a correlation between sensed behavior and perceived collaboration, particularly through shared attention and proximity.
Problem

Research questions and friction points this paper is trying to address.

Exploring if sensor signals reflect user collaboration experience in MR
Bridging gap between observable behavior and subjective collaboration perception
Validating correlation between sensed behavior and perceived collaboration in MR
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sensor-to-Subjective (S2S) Mapping Framework
Links interaction patterns to subjective perceptions
Uses shared gaze, speech, movement metrics
🔎 Similar Papers
No similar papers found.