🤖 AI Summary
This study addresses users’ limited awareness of the fine-grained physiological and behavioral data collected in immersive extended reality (XR) environments. Through a scenario-based survey of 464 XR users and mixed-methods analysis, we empirically demonstrate that over 60% of users are unaware of involuntary biometric data capture, such as unconscious bodily signals of emotional responses, which limits the privacy-protective strategies they employ. Across 18 distinct XR privacy scenarios, we identify how data type and sensitivity critically shape user perceptions. Based on these findings, we call for XR-specific privacy-choice interface designs and transparent data practices. Our work provides empirically grounded, actionable principles for privacy-preserving human–XR interaction, bridging critical gaps between technical capability and user understanding in immersive computing.
📝 Abstract
Extended Reality (XR) technology is changing online interactions, but its granular data-collection sensors may be more invasive of user privacy than web, mobile, and Internet of Things (IoT) technologies. Despite increased interest in studying developers’ concerns about XR device privacy, users’ perceptions have rarely been addressed. We surveyed 464 XR users to assess their awareness of, concerns about, and coping strategies for XR data across 18 scenarios. Our findings demonstrate that many factors, such as data type and sensitivity, affect users’ perceptions of privacy in XR. However, users’ limited awareness of XR sensors’ granular data-collection capabilities, such as involuntary body signals of emotional responses, restricted the range of privacy-protective strategies they used. Our results highlight the need to enhance users’ awareness of data privacy threats in XR, design privacy-choice interfaces tailored to XR environments, and develop transparent XR data practices.