🤖 AI Summary
This study addresses the heightened user privacy risk perceptions associated with smart personal assistants (SPAs) in smart homes when data traverses public networks and third-party boundaries, where existing anonymization techniques prove largely ineffective. Grounded in Privacy Boundary Theory, the research employs a mixed-methods approach—comprising 412 survey responses and 40 interviews—to systematically examine how users perceive risks across varying data types, transmission paths, and sharing scopes. It presents the first empirical application of Privacy Boundary Theory to the SPA context, revealing a nonlinear surge in risk perception at critical boundary transitions. Furthermore, the study identifies significant moderating effects of data attributes and contextual factors, offering empirical foundations for designing boundary-aware privacy mechanisms in intelligent home environments.
📝 Abstract
As Smart Home Personal Assistants (SPAs) evolve into social agents, understanding user privacy necessitates interpersonal communication frameworks, such as Privacy Boundary Theory (PBT). To ground our investigation, our three-phase preliminary study (1) identified transmission and sharing ranges as key boundary-related risk factors, (2) categorized relevant SPA functions and data types, and (3) analyzed commercial practices, revealing widespread data sharing and non-transparent safeguards. A subsequent mixed-methods study (N=412 survey, N=40 follow-up interviews drawn from the survey participants) assessed users' perceived privacy risks across data types, transmission ranges, and sharing ranges. Results demonstrate a significant, non-linear escalation in perceived risk when data crosses two critical boundaries: the "public network" (transmission) and "third parties" (sharing). This boundary effect holds robustly across data types and demographics. Furthermore, risk perception is modulated by data attributes (e.g., social relational data) and contextual privacy calculus. Conversely, anonymization safeguards show limited efficacy, especially for third-party sharing, a finding attributed to user distrust. These findings empirically ground PBT in the SPA context and inform the design of boundary-aware privacy protection mechanisms.