NAMET: Robust Massive Model Editing via Noise-Aware Memory Optimization

📅 2025-05-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
Large language models (LLMs) suffer significant performance degradation during large-scale knowledge editing—particularly when updating thousands of facts—primarily due to embedding-space collisions among knowledge items. To address this, we propose NAMET (Noise-aware Model Editing in Transformers), which injects controllable noise into the memory extraction phase of MEMIT to achieve collision-resilient editing. NAMET requires only a one-line code modification for integration and substantially improves editing robustness. Extensive experiments across six mainstream LLMs and three benchmark datasets demonstrate that, when editing thousands of factual statements, NAMET achieves an average 12.7% improvement in factual accuracy and maintains 91.4% contextual consistency. This work establishes a new paradigm for efficient, scalable, and reliable knowledge updating in LLMs.

📝 Abstract
Model editing techniques are essential for efficiently updating knowledge in large language models (LLMs). However, the effectiveness of existing approaches degrades in massive editing scenarios, particularly when evaluated with practical metrics or in context-rich settings. We attribute these failures to embedding collisions among knowledge items, which undermine editing reliability at scale. To address this, we propose NAMET (Noise-aware Model Editing in Transformers), a simple yet effective method that introduces noise during memory extraction via a one-line modification to MEMIT. Extensive experiments across six LLMs and three datasets demonstrate that NAMET consistently outperforms existing methods when editing thousands of facts.
Problem

Research questions and friction points this paper is trying to address.

Improving reliability of large-scale model editing
Addressing embedding collisions in knowledge updates
Enhancing performance in context-rich editing scenarios
Innovation

Methods, ideas, or system contributions that make the work stand out.

Noise-aware memory optimization for robust editing
One-line modification to MEMIT for noise injection
Effective in massive editing of thousands of facts