🤖 AI Summary
This work addresses the limitations of existing retrieval-augmented generation (RAG) systems, which employ uniform anonymization strategies that struggle to balance privacy preservation with generation quality. The authors propose TRIP-RAG, a novel framework that introduces a context-aware dynamic anonymization mechanism. By jointly evaluating each entity's marginal privacy risk, knowledge deviation, and topical relevance, TRIP-RAG enables fine-grained, differentiated privacy protection. This approach moves beyond the conventional "one-size-fits-all" paradigm, achieving a privacy level comparable to full anonymization while substantially improving model utility: it incurs less than a 35% drop in Recall@k and boosts generation quality by up to 56% over current baselines.
📄 Abstract
Retrieval-Augmented Generation (RAG) enhances the utility of Large Language Models (LLMs) by retrieving external documents. Since the knowledge databases in RAG are predominantly accessed via cloud services, private data in sensitive domains such as finance and healthcare faces the risk of personal information leakage. Effectively anonymizing knowledge bases is therefore crucial for privacy preservation. Existing studies treat the privacy risk of a text as the linear superposition of the privacy risks of individual, isolated sensitive entities; the resulting "one-size-fits-all" processing of all sensitive entities severely degrades the utility of the LLM. To address this issue, we introduce a dynamic anonymization framework named TRIP-RAG. Based on context-aware entity quantification, this framework evaluates entities from the perspectives of marginal privacy risk, knowledge divergence, and topical relevance. It identifies highly sensitive entities while trading off privacy against utility, providing a feasible approach for variable-intensity privacy protection scenarios. Our theoretical analysis and experiments indicate that TRIP-RAG can effectively reduce context inference risks. Extensive experimental results demonstrate that, while maintaining privacy protection comparable to full anonymization, TRIP-RAG's Recall@k decreases by less than 35% compared to the original data, and the generation quality improves by up to 56% over existing baselines.
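To make the selective mechanism concrete, the sketch below illustrates how per-entity anonymization decisions could combine the three signals named in the abstract (marginal privacy risk, knowledge divergence, topical relevance). This is a minimal illustration under assumed conventions, not the paper's implementation: the scoring function, the weights, and the threshold are all hypothetical.

```python
# Hypothetical sketch of context-aware selective anonymization:
# mask an entity only when its privacy risk outweighs its utility value.
# Weights and threshold below are illustrative, not from the paper.

def anonymization_score(privacy_risk, knowledge_divergence, topical_relevance,
                        weights=(0.5, 0.3, 0.2)):
    """Combine the three context-aware signals into a single score.

    High marginal privacy risk argues for anonymizing the entity; high
    knowledge divergence and topical relevance (the entity carries
    information the generator needs) argue for keeping it.
    All inputs are assumed normalized to [0, 1].
    """
    w_risk, w_div, w_rel = weights
    utility_cost = w_div * knowledge_divergence + w_rel * topical_relevance
    return w_risk * privacy_risk - utility_cost


def select_entities_to_mask(entities, threshold=0.1):
    """Return the names of entities whose score exceeds the threshold."""
    return [
        e["name"]
        for e in entities
        if anonymization_score(e["risk"], e["divergence"], e["relevance"]) > threshold
    ]


entities = [
    # A patient name: high risk, low utility -> masked.
    {"name": "John Doe",  "risk": 0.9, "divergence": 0.2, "relevance": 0.1},
    # A drug name: lower risk, high utility for answering queries -> kept.
    {"name": "metformin", "risk": 0.3, "divergence": 0.8, "relevance": 0.9},
]
print(select_entities_to_mask(entities))  # -> ['John Doe']
```

The point of the sketch is the departure from "one-size-fits-all" processing: instead of masking every detected entity, each entity is scored in context, so low-risk, high-utility entities survive anonymization and retrieval quality degrades less.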