Persuasion Tokens for Editing Factual Knowledge in LLMs

📅 2026-01-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes “persuasion tokens” (P-Tokens)—learnable, universal tokens that replace fact-specific examples in in-context knowledge editing (IKE). Trained end-to-end, P-Tokens enable efficient knowledge injection without consuming additional context length, addressing the high cost and heavy context-window usage of conventional IKE. The approach supports compositional optimization with multiple P-Tokens and matches or exceeds IKE performance across two knowledge editing benchmarks and three large language models. It is also robust to distractors and improves as the number of P-Tokens grows, enhancing both the efficiency and scalability of knowledge editing.

📝 Abstract
In-context knowledge editing (IKE) is a promising technique for updating Large Language Models (LLMs) with new information. However, IKE relies on lengthy, fact-specific demonstrations, which are costly to create and consume significant context window space. In this paper, we introduce persuasion tokens (P-Tokens) -- special tokens trained to replicate the effect of IKE demonstrations, enabling efficient knowledge editing without requiring fact-specific demonstrations. We evaluate P-Tokens across two editing datasets and three LLMs, demonstrating performance comparable to, and often exceeding, IKE. We further find that editing performance is robust to distractors, with only small negative effects on neighboring facts, and that increasing the number of P-Tokens improves performance. Our work addresses key limitations of IKE and provides a more practical and scalable alternative for editing LLMs.
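The core idea — learnable tokens prepended to a prompt and optimized end-to-end while the model stays frozen — is a form of soft-prompt tuning. The paper's actual architecture and training objective are not reproduced here; the following is a minimal sketch on a toy frozen "model" (mean-pooled embeddings plus a linear head, standing in for an LLM), where the P-Token embeddings `P`, dimensions, and token IDs are all illustrative assumptions.

```python
# Hedged sketch of P-Token-style soft-prompt training on a toy frozen model.
# The "model" (mean pooling + linear head), dimensions, and token IDs are
# illustrative assumptions, not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)
vocab, dim, n_ptokens = 10, 8, 2

E = rng.normal(size=(vocab, dim))   # frozen token-embedding table
W = rng.normal(size=(vocab, dim))   # frozen output head

# Learnable persuasion tokens, prepended to every edited prompt.
P = rng.normal(size=(n_ptokens, dim)) * 0.01

def forward(P, prompt_ids):
    H = np.vstack([P, E[prompt_ids]])   # prepend P-Tokens to the prompt
    h = H.mean(axis=0)                  # toy pooling in place of attention
    logits = W @ h
    p = np.exp(logits - logits.max())
    return p / p.sum()

# Optimize only P so the frozen model predicts the edited fact's target token.
prompt_ids = np.array([3, 1, 4])        # stand-in for "The capital of X is"
target, lr = 7, 0.5                     # edited answer token (illustrative)
n_total = n_ptokens + len(prompt_ids)
for _ in range(1000):
    probs = forward(P, prompt_ids)
    # d(cross-entropy)/dh = W^T (probs - onehot); mean pooling divides by n.
    g_h = W.T @ (probs - np.eye(vocab)[target])
    P -= lr * g_h / n_total             # same gradient for each P-Token row

probs = forward(P, prompt_ids)
print(int(probs.argmax()))
```

Only `P` receives gradients; the embedding table and head stay fixed, mirroring how P-Tokens edit behavior without touching model weights or consuming demonstration-length context.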
Problem

Research questions and friction points this paper is trying to address.

in-context knowledge editing
Large Language Models
factual knowledge editing
context window efficiency
knowledge updating
Innovation

Methods, ideas, or system contributions that make the work stand out.

Persuasion Tokens
In-context Knowledge Editing
Large Language Models
Knowledge Editing
Efficient Prompting