🤖 AI Summary
Users widely perceive a loss of control over their data, a perception this paper traces to privacy mechanisms that conflate two distinct types of control: interpersonal privacy (e.g., social sharing settings) and user-institutional privacy (e.g., platform-level data processing rights). The paper introduces and substantiates this "dichotomy of control objects," arguing that current privacy design overemphasizes the former while systematically neglecting the latter. Through an interdisciplinary investigation—controlled human-computer interaction experiments, comparative analysis of privacy policy texts, and critical information ethics inquiry—the study exposes the structural inadequacies of prevailing "one-size-fits-all" privacy interfaces. The findings inform the redesign of privacy interfaces, strengthen regulatory enforcement (e.g., GDPR compliance), and support institutional innovations such as data trusts. Collectively, this work shifts the paradigm from procedural consent toward substantive user empowerment in data governance.
📝 Abstract
Users share vast amounts of data while using web and mobile applications. Most service providers, such as email and social media platforms, offer privacy controls that aim to give users the means to decide what, how, when, and with whom they share data. Nevertheless, it is not uncommon to hear users say they feel they have lost control over their data on the web. This article sheds light on the often overlooked difference between two main types of privacy from a control perspective: privacy between a user and other users, and privacy between a user and institutions. We argue why this difference matters and what we need to do from here.