🤖 AI Summary
This study identifies critical trust bottlenecks that limit AI programming assistants' effectiveness at generating privacy-compliant code: developers' misinterpretation of privacy requirements, insufficient knowledge of data protection regulations (e.g., GDPR, CCPA), and the difficulty of verifying model outputs. Drawing on a survey of 51 developers worldwide, analysed thematically, it offers the first systematic characterisation of the gap between developers' expectations of AI-assisted privacy-aware coding and their experience in practice. The contributions are twofold: (1) seven empirically grounded, actionable usage guidelines for developers, covering prompt engineering, human-in-the-loop review, and context augmentation; and (2) four high-priority research directions: privacy-aware modelling, regulation-grounded explainable reasoning, verifiable code generation, and cross-jurisdictional compliance adaptation. Together, these establish an evidence-based foundation for improving the trustworthiness and practical utility of AI programming assistants in data protection contexts.
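To make the guideline themes concrete, here is a minimal sketch of how a developer might combine prompt engineering, context augmentation, and a hook for human-in-the-loop review before sending a task to an assistant. All names, the `PRIVACY_CONTEXT` wording, and the `# PII` marking convention are hypothetical illustrations, not the paper's actual guidelines.

```python
# Hypothetical sketch of privacy-aware prompt construction.
# The context text and review convention below are illustrative assumptions.

PRIVACY_CONTEXT = (
    "Constraints: follow data-minimisation; collect only fields strictly "
    "needed for the stated purpose; never log raw personal data."
)

def build_privacy_aware_prompt(task: str, regulations: list[str]) -> str:
    """Augment a plain coding task with explicit privacy context and
    an instruction that supports later human review."""
    reg_line = "Applicable regulations: " + ", ".join(regulations)
    review_hint = (
        "Mark every line that touches personal data with a '# PII' "
        "comment so a human reviewer can audit it."
    )
    return "\n".join([task, reg_line, PRIVACY_CONTEXT, review_hint])

prompt = build_privacy_aware_prompt(
    "Write a signup handler that stores a user's email address.",
    ["GDPR", "CCPA"],
)
print(prompt)
```

The point of the sketch is the structure, not the wording: the regulation list and constraints travel with every request, and the review instruction makes the generated code easier to audit by hand.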
📝 Abstract
With the popularising of generative AI, AI-based programming assistants for developers have become commonplace. Developers increasingly use them in their work, including to generate code that fulfils the data protection (privacy) requirements of the applications they build. We investigated whether the reality of using AI-based programming assistants to fulfil software privacy requirements matches developers' expectations, what challenges developers face when using these assistants, and how they could be improved. To this end, we conducted a survey of 51 developers worldwide. We found that AI-based programming assistants need to improve before developers can better trust them to generate code that ensures privacy. In this paper, we provide practical recommendations for developers to consider when using AI-based programming assistants for privacy-related code development, along with key directions for further research.