🤖 AI Summary
This study investigates whether developers' difficulties in complying with the General Data Protection Regulation (GDPR) stem primarily from knowledge gaps or from insufficient tooling and support. Method: a 5-hour contextual programming experiment with 30 professional developers, using a three-condition controlled design (control, prompt-based assistance, expert support), complemented by analysis of task submissions and semi-structured interviews. Contribution/Results: only 12.2% of the 90 implemented features complied fully with the GDPR, with violations concentrated in the purpose limitation and user consent principles. Developers exhibited counterintuitive behavioral patterns: they prioritized functional correctness over privacy, rarely sought help, and seldom consulted privacy documentation. Compliance improved only marginally under prompting (20%) or expert support (26.7%). Notably, 87% of participants lacked confidence in their GDPR implementations yet seldom verified their correctness. The findings suggest that the core challenge lies not in knowledge deficits per se, but in cognitive inertia and a misalignment between developers' mental models and existing support mechanisms, providing empirical grounding for privacy-enhancing tool design and privacy-aware software engineering education.
📝 Abstract
While protecting user data is essential, software developers often fail to fulfill privacy requirements. However, the reasons why they struggle with privacy-compliant implementation remain unclear. Is it due to a lack of knowledge, or to insufficient support? To provide foundational insights into this question, we conducted a qualitative 5-hour programming study with 30 professional software developers, each implementing 3 privacy-sensitive programming tasks designed with GDPR compliance in mind. To explore if and how developers implement privacy requirements, participants were divided into 3 groups: control, privacy-prompted, and privacy expert-supported. After task completion, we conducted follow-up interviews. Alarmingly, almost all participants submitted non-GDPR-compliant solutions (79/90). In particular, none of the 3 tasks was solved in a privacy-compliant manner by all 30 participants, with the non-prompted group producing the fewest privacy-compliant solution attempts (3 out of 30). Privacy prompting and expert support only slightly improved participants' submissions, with 6/30 and 8/30 privacy-compliant attempts, respectively. In fact, all participants reported severe issues addressing common privacy requirements such as purpose limitation, user consent, and data minimization. Counterintuitively, although most developers expressed minimal confidence in their solutions, they rarely sought online assistance or contacted the privacy expert; only 4 out of 10 expert-supported participants explicitly asked for compliance confirmation. Instead, participants often relied on existing implementations and focused on implementing functionality and security first.