Enhancing Cross-Lingual Transfer through Reversible Transliteration: A Huffman-Based Approach for Low-Resource Languages

📅 2025-09-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Low-resource languages—particularly those using non-Latin scripts—suffer from poor cross-lingual transfer performance in large language models (LLMs), while existing transliteration methods lack systematic integration into model training and inference. To address this, we propose the first lightweight input representation framework that tightly integrates reversible transliteration with Huffman coding. Our method achieves lossless, fully invertible transliteration compression at the character level, requiring no vocabulary expansion or additional model parameters. Its key innovation lies in the first application of Huffman coding to compress transliterated sequences, simultaneously improving storage efficiency (50% reduction in file size), computational efficiency (50–80% fewer tokens), and multilingual scalability. Extensive experiments on text classification, machine reading comprehension, and machine translation demonstrate substantial performance gains for low-resource languages, without compromising accuracy on high-resource languages.

📝 Abstract
As large language models (LLMs) are trained on increasingly diverse and extensive multilingual corpora, they demonstrate cross-lingual transfer capabilities. However, these capabilities often fail to extend effectively to low-resource languages, particularly those using non-Latin scripts. While transliterating low-resource languages into Latin script is a natural solution, no comprehensive framework currently exists for integrating transliteration into LLM training and deployment. Taking a pragmatic approach, this paper combines character transliteration with Huffman coding to design a complete transliteration framework. The proposed framework offers the following advantages: 1) Compression: Reduces storage requirements for low-resource language content, achieving up to a 50% reduction in file size and a 50-80% reduction in token count. 2) Accuracy: Guarantees 100% lossless conversion from transliterated text back to the source language. 3) Efficiency: Eliminates the need for vocabulary expansion for low-resource languages, improving training and inference efficiency. 4) Scalability: The framework can be extended to other low-resource languages. We validate the effectiveness of our framework across multiple downstream tasks, including text classification, machine reading comprehension, and machine translation. Experimental results demonstrate that our method significantly enhances the model's capability to process low-resource languages while maintaining performance on high-resource languages. Our data and code are publicly available at https://github.com/CMLI-NLP/HuffmanTranslit.
Problem

Research questions and friction points this paper is trying to address.

Improving cross-lingual transfer for low-resource non-Latin script languages
Developing reversible transliteration framework using Huffman coding approach
Reducing storage and token requirements while ensuring lossless conversion
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines character transliteration with Huffman coding compression
Enables lossless reversible conversion for low-resource languages
Eliminates vocabulary expansion needs while improving efficiency
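The core mechanism the bullets describe — Huffman-coding a Latin transliteration at the character level so the original text can be recovered exactly — can be sketched as follows. This is a minimal illustration, not the authors' implementation (see their repository for that); the sample transliteration string and helper names are made up for the example.

```python
# Minimal sketch of character-level Huffman coding over a Latin
# transliteration, showing the lossless round trip the paper guarantees.
import heapq
from collections import Counter
from itertools import count

def build_huffman_codes(text: str) -> dict[str, str]:
    """Assign shorter bit strings to more frequent characters."""
    freq = Counter(text)
    tiebreak = count()  # breaks ties so dicts are never compared in the heap
    heap = [(n, next(tiebreak), {ch: ""}) for ch, n in freq.items()]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol input
        return {ch: "0" for ch in freq}
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)
        n2, _, c2 = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in c1.items()}
        merged.update({ch: "1" + code for ch, code in c2.items()})
        heapq.heappush(heap, (n1 + n2, next(tiebreak), merged))
    return heap[0][2]

def encode(text: str, codes: dict[str, str]) -> str:
    return "".join(codes[ch] for ch in text)

def decode(bits: str, codes: dict[str, str]) -> str:
    # Huffman codes are prefix-free, so greedy matching is unambiguous.
    inverse = {code: ch for ch, code in codes.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

# Illustrative input: a Wylie-style Tibetan transliteration (hypothetical).
translit = "bkra shis bde legs"
codes = build_huffman_codes(translit)
bits = encode(translit, codes)
assert decode(bits, codes) == translit          # 100% lossless round trip
print(len(bits), "Huffman bits vs", 8 * len(translit), "bits as 8-bit chars")
```

Because the Huffman tree is built per corpus and stored alongside the data, decoding recovers the transliteration exactly, and the original script is then restored through the invertible transliteration map — no vocabulary expansion is needed on the model side.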
Wenhao Zhuang
Kuaishou Technology
Yuan Sun
Minzu University of China, Beijing, China; National Language Resource Monitoring & Research Center Minority Languages Branch; Institute of National Security, Minzu University of China, Beijing, China
Xiaobing Zhao
Minzu University of China, Beijing, China; National Language Resource Monitoring & Research Center Minority Languages Branch; Institute of National Security, Minzu University of China, Beijing, China