🤖 AI Summary
This study addresses the heightened privacy risks posed by generative AI (GenAI)-enabled smartphones, which deeply integrate large language models at the system level and continuously access sensitive user data—risks that significantly exceed those of conventional devices. Despite these concerns, user awareness and expectations regarding such systems remain unclear. Through 22 semi-structured interviews and a follow-up focus group, this work provides the first systematic investigation into users’ cognitive blind spots and primary privacy concerns across the entire data lifecycle. Findings reveal that users’ privacy apprehensions intensify markedly once they gain a deeper understanding of the underlying technical mechanisms. Building on these insights, the study proposes a user-centered, collaborative privacy design framework that integrates system-level controls, data management practices, and transparency mechanisms to guide developers and regulators in safeguarding user privacy.
📝 Abstract
GenAI smartphones, which natively embed generative AI at the system level, are transforming mobile interactions by automating a wide range of tasks and executing UI actions on behalf of users. Their superior capabilities rely on continuous access to sensitive and context-rich data, raising privacy concerns that surpass those of traditional mobile devices. Yet little is known about how users perceive the privacy implications of such devices or what safeguards they expect—questions that are especially critical at this early stage of GenAI smartphone adoption. To address this gap, we conduct 22 semi-structured interviews with everyday mobile users to explore their usage of GenAI smartphones, privacy concerns, and privacy design expectations. Our findings show that users engage with GenAI smartphones with limited understanding of how these systems operate to deliver their functionality, but express heightened privacy concerns once exposed to the technical details. Participants' concerns span the entire data lifecycle, including nontransparent collection, insecure storage, and weak data control. In a follow-up focus group, participants discuss a range of privacy-enhancing suggestions that call for coordinated changes across system-level controls, data management practices, and user-facing transparency. Their concerns and suggestions provide user-centered guidance for designing GenAI smartphones that balance functionality with privacy protection, offering valuable takeaways for system designers and regulators.