🤖 AI Summary
In mixed reality (MR) collaboration, a significant mapping gap exists between objective sensor signals—such as gaze, speech, and spatial motion—and users’ subjective collaborative experience, which is governed by implicit states including presence, cognitive usability, and social awareness. To bridge this gap, we propose the first Sensor-to-Subjective (S2S) mapping framework. It integrates multimodal sensing (eye tracking, speech analysis, and pose estimation), task log analytics, and interpretable regression modeling to systematically characterize the relationship between behavioral metrics and subjective experience. Evaluated in a controlled study with 48 participants across 12 MR collaborative tasks, the framework demonstrates that shared gaze and physical proximity significantly predict subjective collaboration perception (p < 0.01). This work establishes the first objective, signal-driven, and interpretable method for inferring subjective collaborative experience in MR, advancing both evaluation methodology and design guidance for immersive collaborative systems.
📝 Abstract
Mixed Reality (MR) enables rich, embodied collaboration, yet it's uncertain if sensor and system-logged behavioral signals capture how users experience that collaboration. This disconnect stems from a fundamental gap: behavioral signals are observable and continuous, while collaboration is interpreted subjectively, shaped by internal states like presence, cognitive availability, and social awareness. Our core insight is that sensor signals serve as observable manifestations of subjective experiences in MR collaboration, and they can be captured through sensor data such as shared gaze, speech, spatial movement, and other system-logged performance metrics. We propose the Sensor-to-Subjective (S2S) Mapping Framework, a conceptual model that links observable interaction patterns to users' subjective perceptions of collaboration and internal cognitive states through sensor-based indicators and task performance metrics. To validate this model, we conducted a study with 48 participants across 12 MR groups engaged in a collaborative image-sorting task. Our findings show a correlation between sensed behavior and perceived collaboration, particularly through shared attention and proximity.