Privacy Perceptions in Robot-Assisted Well-Being Coaching: Examining the Roles of Information Transparency, User Control, and Proactivity

📅 2025-09-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates key determinants of users' privacy perceptions in robot-assisted well-being coaching, focusing on information transparency, user control over personal data, and the robot's behavioral proactivity. Method: A controlled online experiment (N=200) systematically manipulated these three factors in factorial combinations to quantitatively assess users' judgments of privacy appropriateness and trust. Contribution/Results: Neither increased transparency nor heightened robot proactivity alone significantly improved privacy perceptions. Only explicit, actionable user control, specifically over how personal health information is interpreted and shared, significantly enhanced both privacy appropriateness ratings (p<0.01) and trust. This provides empirical evidence that user control is a critical factor in human-robot health interactions. The findings offer foundational evidence and actionable design principles for developing trustworthy, privacy-sensitive social robots in healthcare contexts.

📝 Abstract
Social robots are increasingly recognized as valuable supporters in well-being coaching. They can function as independent coaches or provide support alongside human coaches and healthcare professionals. In coaching interactions, these robots often handle sensitive information shared by users, making privacy a relevant concern. Despite this, little is known about the factors that shape users' privacy perceptions. This research systematically examines three key factors: (1) transparency about information usage, (2) the level of specific user control over how the robot uses their information, and (3) the robot's behavioral approach, i.e., whether it acts proactively or only responds on demand. Our results from an online study (N = 200) show that even when users grant the robot general access to personal data, they additionally expect the ability to explicitly control how that information is interpreted and shared during sessions. Experimental conditions that provided such control received significantly higher ratings for perceived privacy appropriateness and trust. Compared to user control, the effects of transparency and proactivity on perceived privacy appropriateness were small, and we found no significant impact. The results suggest that merely informing users, or sharing information proactively, is insufficient without accompanying user control. These insights underscore the need for further research on mechanisms that allow users to manage robots' information processing and sharing, especially as social robots take on more proactive roles alongside humans.
Problem

Research questions and friction points this paper is trying to address.

Examining factors shaping privacy perceptions in robot-assisted well-being coaching
Investigating how information transparency, user control, and proactivity affect privacy
Addressing user expectations for controlling information interpretation and sharing by robots
Innovation

Methods, ideas, or system contributions that make the work stand out.

User control over robot information interpretation
Transparency and proactivity insufficient without control
Mechanisms needed for managing robot information sharing
Atikkhan Faridkhan Nilgar
Ubiquitous Computing, University of Siegen, Hoelderlinstrasse 3, 57076 Siegen, Germany
Manuel Dietrich
Honda Research Institute Europe GmbH, Carl-Legien-Strasse 30, 63073 Offenbach am Main, Germany
Kristof Van Laerhoven
University of Siegen
activity recognition, wearable computing, wearable sensors, embedded systems, machine learning