🤖 AI Summary
This study investigates how privacy safeguards influence public willingness to donate health data to for-profit versus non-profit organizations. Using a scenario-based survey experiment with 494 participants, we employ multivariate regression and causal inference to systematically evaluate the effects of four privacy-enhancing technologies (PETs)—data expiration, anonymization, purpose restriction, and access control—and compare the trust-building efficacy of self-auditing versus expert auditing. Our key contributions are: (1) empirical identification of a “privacy default trust” toward non-profits; (2) demonstration that PETs significantly mitigate the trust deficit associated with for-profits, narrowing the credibility gap between sectors; and (3) the finding that expert auditing fails to significantly enhance trust, revealing a critical misalignment between technical assurances and public perception. These results challenge the prevailing assumption that audit transparency inherently increases trust, underscoring the need for interdisciplinary, socio-technical governance frameworks in health data stewardship.
📝 Abstract
The voluntary donation of private health information for altruistic purposes, such as supporting research advancements, is a common practice. However, concerns about data misuse and leakage may deter people from donating their information. Privacy-Enhancing Technologies (PETs) aim to alleviate these concerns and, in turn, enable safe and private data sharing. This study conducts a vignette survey (N=494) with participants recruited from Prolific to examine the willingness of US-based individuals to donate medical data for developing new treatments under four general guarantees offered across PETs: data expiration, anonymization, purpose restriction, and access control. The study also examines two mechanisms for verifying these guarantees, self-auditing and expert auditing, and controls for confounds including demographics and two types of data collectors: for-profit and non-profit institutions. Our findings reveal that respondents hold such high a priori expectations of privacy toward non-profit entities that explicitly outlining privacy protections has little impact on their overall perceptions. In contrast, offering privacy guarantees elevates respondents' expectations of privacy for for-profit entities, bringing them nearly in line with those for non-profit organizations. Moreover, although the technical community has proposed audits as a mechanism to increase trust in PET guarantees, we observe only a limited effect from transparency about such audits. We emphasize the risks associated with these findings and underscore the critical need for future interdisciplinary research to bridge the gap between the technical community's and end-users' perceptions of the effectiveness of auditing PETs.
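An experimental design like the one described (willingness ratings under PET-guarantee and collector-type conditions) is commonly analyzed with a regression of willingness on condition indicators and their interaction. The following is a minimal, hypothetical sketch in Python with NumPy; the variable names, effect sizes, and simulated data are illustrative assumptions, not the study's actual model or results:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 494  # sample size matching the study

# Hypothetical binary conditions: whether a PET guarantee is offered,
# and whether the data collector is a for-profit institution.
pet = rng.integers(0, 2, n)
for_profit = rng.integers(0, 2, n)

# Simulated willingness scores encoding the paper's qualitative pattern:
# a trust deficit for for-profits, a small PET effect for non-profits,
# and an interaction where PETs narrow the for-profit gap.
willingness = (
    4.0
    - 1.0 * for_profit
    + 0.1 * pet
    + 0.8 * pet * for_profit
    + rng.normal(0, 0.5, n)
)

# Ordinary least squares with intercept and interaction term.
X = np.column_stack([np.ones(n), pet, for_profit, pet * for_profit])
beta, *_ = np.linalg.lstsq(X, willingness, rcond=None)
print(beta.round(2))  # [intercept, pet, for_profit, pet:for_profit]
```

With this specification, a negative `for_profit` coefficient captures the baseline trust deficit, and a positive interaction coefficient captures PETs offsetting it, mirroring the pattern reported in the abstract.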