Zero-shot Cross-lingual NER via Mitigating Language Difference: An Entity-aligned Translation Perspective

📅 2025-09-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing zero-shot cross-lingual named entity recognition (ZCL-NER) methods suffer significant performance degradation on non-Latin-script languages (e.g., Chinese, Japanese), primarily due to deep linguistic structural disparities that impede effective knowledge transfer. To address this, we propose an LLM-based entity-aligned translation framework. Our method introduces: (1) a bidirectional translation mechanism—forward (source→target) and back-translation—that explicitly models cross-lingual entity alignment and semantic consistency; and (2) fine-tuning of multilingual LLMs on Wikipedia-derived parallel entity data to enhance cross-script and cross-structural entity mapping capabilities. Experiments across multiple low-resource non-Latin languages demonstrate substantial improvements in zero-shot transfer performance. The approach effectively mitigates negative transfer induced by typological divergence, offering a novel paradigm for generalizing ZCL-NER across morphologically and structurally dissimilar languages.
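The paper does not expose its implementation here, but the entity-alignment idea in the translation step can be illustrated with a common placeholder-masking scheme: mask known entity mentions before translation so the translator cannot drop or reorder them, then restore them afterwards. A minimal sketch (the `translate_with_entity_alignment` helper and the toy lookup translator are illustrative assumptions, not the authors' code):

```python
from typing import Callable

def translate_with_entity_alignment(
    text: str,
    entities: list[str],
    translate: Callable[[str], str],
) -> tuple[str, list[str]]:
    # Mask each known entity with a stable placeholder so the
    # translator cannot drop, split, or reorder entity spans.
    masked = text
    for i, ent in enumerate(entities):
        masked = masked.replace(ent, f"<ENT{i}>")
    translated = translate(masked)
    # Restore the original entity strings, keeping a one-to-one
    # source/target alignment usable for label projection.
    for i, ent in enumerate(entities):
        translated = translated.replace(f"<ENT{i}>", ent)
    return translated, entities

# Toy lookup table standing in for the LLM translation call.
toy_mt = {"<ENT0>在<ENT1>工作。": "<ENT0> works at <ENT1>."}
out, ents = translate_with_entity_alignment(
    "小明在北京大学工作。", ["小明", "北京大学"], lambda s: toy_mt[s]
)
# out keeps both entity spans intact in the translated sentence.
```

In the back-translation direction, the same placeholders let the framework check that every source entity survives the round trip, which is one way to enforce the semantic-consistency constraint the summary describes.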


📝 Abstract
Cross-lingual Named Entity Recognition (CL-NER) aims to transfer knowledge from high-resource languages to low-resource languages. However, existing zero-shot CL-NER (ZCL-NER) approaches primarily focus on Latin-script languages (LSL), where shared linguistic features facilitate effective knowledge transfer. In contrast, performance often degrades for non-Latin-script languages (NSL), such as Chinese and Japanese, due to deep structural differences. To address these challenges, we propose an entity-aligned translation (EAT) approach. Leveraging large language models (LLMs), EAT employs a dual-translation strategy to align entities between NSL and English. In addition, we fine-tune the LLMs on multilingual Wikipedia data to strengthen entity alignment from the source to the target language.
Problem

Research questions and friction points this paper is trying to address.

Addressing performance degradation in zero-shot cross-lingual NER for non-Latin script languages
Mitigating structural language differences between Latin and non-Latin script languages
Improving entity alignment through translation strategies and LLM fine-tuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Entity-aligned translation using LLMs
Dual-translation strategy for alignment
Fine-tuning with multilingual Wikipedia data
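The fine-tuning contribution can be sketched as a data-construction step: cross-lingual entity pairs (e.g., titles of the same Wikipedia article in two languages) are turned into supervised records that teach the LLM source-to-target entity mapping. The pair list, field names, and instruction template below are hypothetical placeholders, not the paper's actual dataset format:

```python
import json

# Hypothetical parallel entity pairs, as could be mined from
# Wikipedia inter-language links (same article, two languages).
PARALLEL_ENTITIES = [
    {"zh": "北京大学", "en": "Peking University", "type": "ORG"},
    {"ja": "東京", "en": "Tokyo", "type": "LOC"},
]

def to_sft_record(pair: dict) -> dict:
    # Turn one cross-lingual entity pair into a supervised
    # fine-tuning record mapping the source entity to English.
    src_lang = next(k for k in pair if k not in ("en", "type"))
    return {
        "instruction": f"Translate the {pair['type']} entity into English.",
        "input": pair[src_lang],
        "output": pair["en"],
    }

records = [to_sft_record(p) for p in PARALLEL_ENTITIES]
print(json.dumps(records[0], ensure_ascii=False))
```

Records in this instruction/input/output shape can be fed to standard instruction-tuning pipelines; the actual prompt wording and data scale used by the authors are not given in this summary.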