🤖 AI Summary
Existing in-context knowledge editing methods rely on static demonstration sets chosen by surface-level similarity, suffering from two key bottlenecks: an imbalanced trade-off between demonstration quantity and quality, and poor adaptability to varying task difficulty. This paper proposes DR-IKE, a gradient-free, dynamic retrieval-based in-context editing framework. DR-IKE employs a policy-optimized BERT retriever to dynamically select highly relevant demonstrations and introduces a learnable threshold to prune low-value examples. Relying solely on forward passes, it adapts prompt length to task difficulty, improving both editing accuracy and inference efficiency in black-box API settings. Evaluated on the COUNTERFACT benchmark, DR-IKE achieves up to a 17.1% improvement in edit success rate and a 41.6% reduction in latency, without compromising accuracy on unrelated queries.
📝 Abstract
Large language models (LLMs) excel at factual recall yet still propagate stale or incorrect knowledge. In-context knowledge editing offers a gradient-free remedy suitable for black-box APIs, but current editors rely on static demonstration sets chosen by surface-level similarity, leading to two persistent obstacles: (i) a quantity-quality trade-off, and (ii) lack of adaptivity to task difficulty. We address these issues by dynamically selecting supporting demonstrations according to their utility for the edit. We propose Dynamic Retriever for In-Context Knowledge Editing (DR-IKE), a lightweight framework that (1) trains a BERT retriever with REINFORCE to rank demonstrations by editing reward, and (2) employs a learnable threshold to prune low-value examples, shortening the prompt when the edit is easy and expanding it when the task is hard. DR-IKE performs editing without modifying model weights, relying solely on forward passes for compatibility with black-box LLMs. On the COUNTERFACT benchmark, it improves edit success by up to 17.1%, reduces latency by 41.6%, and preserves accuracy on unrelated queries, demonstrating scalable and adaptive knowledge editing. The code is available at https://github.com/mwnafee/DR-IKE.
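The retrieve-then-prune mechanism described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function names, the Bernoulli keep/drop policy parameterized by a scalar threshold, and the toy scores are all assumptions made here; in the real system a BERT retriever produces the relevance scores and the reward comes from whether the edit succeeded with the resulting prompt.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sample_keep_mask(scores, threshold, rng):
    """Stochastic pruning policy (training time): keep demonstration i
    with probability sigmoid(score_i - threshold)."""
    probs = [sigmoid(s - threshold) for s in scores]
    mask = [rng.random() < p for p in probs]
    return mask, probs

def reinforce_step(threshold, mask, probs, reward, lr=0.5):
    """One REINFORCE update of the scalar threshold t.
    For a Bernoulli policy p_i = sigmoid(s_i - t):
      d/dt log p(keep_i) = -(1 - p_i),   d/dt log p(drop_i) = p_i
    A positive editing reward reinforces the sampled keep/drop decisions."""
    grad = sum(-(1.0 - p) if kept else p for kept, p in zip(mask, probs))
    return threshold + lr * reward * grad

def select_demonstrations(scores, threshold):
    """Greedy pruning (inference time): keep demonstrations whose score
    clears the learned threshold, ranked by descending score."""
    kept = [i for i, s in enumerate(scores) if s >= threshold]
    return sorted(kept, key=lambda i: -scores[i])

# Toy retriever scores for five candidate demonstrations.
scores = [0.92, 0.35, 0.78, 0.51, 0.10]
threshold = 0.5

rng = random.Random(0)
mask, probs = sample_keep_mask(scores, threshold, rng)
reward = 1.0  # stand-in for "the edit succeeded with this prompt"
threshold = reinforce_step(threshold, mask, probs, reward)

print(select_demonstrations(scores, threshold))
```

The scalar threshold is what makes the prompt length adaptive: easy edits, where high-reward prompts need few demonstrations, push the threshold up and shorten the prompt, while hard edits pull it down and admit more examples.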