🤖 AI Summary
This work uncovers pervasive privacy risks arising from excessive user data collection by third-party tools (e.g., GPT Actions) in LLM app ecosystems, using OpenAI's GPT ecosystem as a case study. Method: We propose an LLM-driven framework that parses the natural language specifications of GPT Actions, integrating natural language understanding (NLU), rule-based compliance checking, and structured privacy policy analysis to automatically identify data collection practices and assess their alignment with platform policies. Contribution/Results: We systematically annotate 145 data types across 24 categories and find that Actions collect excessive data, with third-party Actions collecting 6.03% more data on average; several Actions collect sensitive data such as passwords in direct violation of OpenAI's policies; and only 5.8% of Actions clearly disclose their data collection practices. Empirical evaluation demonstrates high accuracy in policy-consistency detection. Our framework establishes a scalable methodology and an empirical benchmark for privacy governance in LLM-powered applications.
📝 Abstract
LLM app (tool) ecosystems are rapidly evolving to support sophisticated use cases that often require extensive user data collection. Given that LLM apps are developed by third parties, and that anecdotal evidence suggests LLM platforms enforce their policies inconsistently, sharing user data with these apps presents significant privacy risks. In this paper, we aim to bring transparency to the data practices of LLM app ecosystems. We examine OpenAI's GPT app ecosystem as a case study. We propose an LLM-based framework to analyze the natural language specifications of GPT Actions (custom tools) and assess their data collection practices. Our analysis reveals that Actions collect excessive data across 24 categories and 145 data types, with third-party Actions collecting 6.03% more data on average. We find that several Actions violate OpenAI's policies by collecting sensitive information, such as passwords, which OpenAI explicitly prohibits. Lastly, we develop an LLM-based privacy policy analysis framework to automatically check whether the data collected by Actions is consistent with the disclosures in their privacy policies. Our measurements indicate that disclosures for most collected data types are omitted, with only 5.8% of Actions clearly disclosing their data collection practices.
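To make the consistency-checking step concrete, here is a minimal sketch of the kind of rule-based check the abstract describes: comparing the data types an Action collects (which the paper extracts upstream with an LLM) against the data types its privacy policy discloses and a platform-prohibited list. All names and the prohibited set below are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of a policy-consistency check, assuming the data types
# collected by an Action and the types disclosed in its privacy policy have
# already been extracted as normalized labels (e.g., by an LLM parsing step).
# The PROHIBITED set is an illustrative stand-in for platform policy rules.

PROHIBITED = {"password", "payment_card_number", "government_id"}

def check_compliance(collected: set[str], disclosed: set[str]) -> dict:
    """Flag platform-policy violations and undisclosed collection."""
    return {
        # data types the platform forbids collecting outright
        "violations": sorted(collected & PROHIBITED),
        # data types collected but omitted from the privacy policy
        "undisclosed": sorted(collected - disclosed),
        # consistent only if everything collected is disclosed and allowed
        "consistent": collected <= disclosed and not (collected & PROHIBITED),
    }

result = check_compliance(
    collected={"email", "password", "location"},
    disclosed={"email"},
)
```

In this toy input the Action collects a prohibited data type (`password`) and fails to disclose two of the three types it collects, so the check reports it as inconsistent. The paper's framework performs the extraction and matching steps with an LLM rather than exact string comparison.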