Beyond English: Toward Inclusive and Scalable Multilingual Machine Translation with LLMs

📅 2025-11-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address critical challenges in massively multilingual machine translation (MMT)—including imbalanced language coverage, poor performance on low-resource languages, and English-centric bias—this paper introduces LMT, a large-scale MMT model family centered on Chinese and English and supporting 60 languages across 234 translation directions. The authors first identify and characterize the "directional degeneration" phenomenon in multilingual training, then propose Strategic Downsampling and Parallel Multilingual Prompting (PMP) to significantly enhance cross-lingual transfer. They further design fine-grained adaptation strategies for efficient multilingual fine-tuning. Experiments demonstrate that LMT achieves state-of-the-art (SOTA) performance under comparable language coverage: its 4B-parameter variant comprehensively outperforms Aya-101-13B and NLLB-54B across diverse benchmarks. The LMT models are fully open-sourced and support flexible deployment across multiple parameter scales.

📝 Abstract
Large language models have significantly advanced Multilingual Machine Translation (MMT), yet broad language coverage, consistent translation quality, and English-centric bias remain open challenges. To address these challenges, we introduce LMT, a suite of Large-scale Multilingual Translation models centered on both Chinese and English, covering 60 languages and 234 translation directions. During development, we identify a previously overlooked phenomenon of directional degeneration, where symmetric multi-way fine-tuning data overemphasize reverse directions (X → En/Zh), leading to excessive many-to-one mappings and degraded translation quality. We propose Strategic Downsampling, a simple yet effective method to mitigate this degeneration. In addition, we design Parallel Multilingual Prompting (PMP), which leverages typologically related auxiliary languages to enhance cross-lingual transfer. Through rigorous data curation and refined adaptation strategies, LMT achieves SOTA performance among models of comparable language coverage, with our 4B model (LMT-60-4B) surpassing the much larger Aya-101-13B and NLLB-54B models by a substantial margin. We release LMT in four sizes (0.6B/1.7B/4B/8B) to catalyze future research and provide strong baselines for inclusive, scalable, and high-quality MMT (https://github.com/NiuTrans/LMT).
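The Strategic Downsampling idea described above can be illustrated with a minimal sketch: when symmetric multi-way data overemphasize reverse directions (X → En/Zh), one can keep all forward-direction pairs while sampling only a fraction of the reverse-direction pairs. The function below is a hypothetical illustration of that rebalancing, not the paper's actual implementation; the `reverse_ratio` value is an assumed placeholder, not the paper's setting.

```python
import random

def strategic_downsample(pairs, reverse_ratio=0.3, seed=0):
    """Illustrative sketch of direction-aware downsampling.

    pairs: list of (src_lang, tgt_lang, src_text, tgt_text) tuples.
    Forward-direction pairs (En/Zh -> X) are all kept; reverse-direction
    pairs (X -> En/Zh) are kept with probability `reverse_ratio` to
    avoid overemphasizing many-to-one mappings into the pivot languages.
    """
    rng = random.Random(seed)  # fixed seed for reproducible sampling
    kept = []
    for src_lang, tgt_lang, src_text, tgt_text in pairs:
        if tgt_lang in ("en", "zh"):
            # reverse direction: keep only a sampled fraction
            if rng.random() < reverse_ratio:
                kept.append((src_lang, tgt_lang, src_text, tgt_text))
        else:
            # forward direction: keep everything
            kept.append((src_lang, tgt_lang, src_text, tgt_text))
    return kept
```

In a real pipeline the ratio would be tuned per direction or per language based on data volume; this sketch only shows the asymmetry between forward and reverse directions.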
Problem

Research questions and friction points this paper is trying to address.

Addressing English-centric bias in multilingual machine translation systems
Mitigating directional degeneration in symmetric multi-way translation models
Improving translation quality across 60 languages and 234 directions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Developed large-scale multilingual translation models covering 60 languages
Introduced strategic downsampling to mitigate directional degeneration
Designed parallel multilingual prompting for enhanced cross-lingual transfer
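As a rough illustration of the Parallel Multilingual Prompting idea, a prompt can prepend parallel sentences in typologically related auxiliary languages before the translation instruction, giving the model extra cross-lingual context. The template below is an assumption for illustration only; the paper's exact prompt format is not shown on this page.

```python
def build_pmp_prompt(src_lang, tgt_lang, src_text, auxiliaries):
    """Sketch of a Parallel Multilingual Prompting (PMP)-style prompt.

    auxiliaries: list of (aux_lang, aux_text) pairs giving the same
    sentence in typologically related languages. The wording of the
    instruction line is a hypothetical template, not the paper's.
    """
    lines = []
    for aux_lang, aux_text in auxiliaries:
        # parallel context in related auxiliary languages
        lines.append(f"{aux_lang}: {aux_text}")
    # the actual source sentence, then the translation instruction
    lines.append(f"{src_lang}: {src_text}")
    lines.append(f"Translate the {src_lang} sentence into {tgt_lang}.")
    return "\n".join(lines)
```

For example, translating from Czech one might supply the Slovak parallel sentence as auxiliary context, since the two languages are closely related.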
Yingfeng Luo
School of Computer Science and Engineering, Northeastern University, Shenyang, China
Ziqiang Xu
School of Computer Science and Engineering, Northeastern University, Shenyang, China
Yuxuan Ouyang
School of Computer Science and Engineering, Northeastern University, Shenyang, China
Murun Yang
School of Computer Science and Engineering, Northeastern University, Shenyang, China
Dingyang Lin
School of Computer Science and Engineering, Northeastern University, Shenyang, China
Kaiyan Chang
School of Computer Science and Engineering, Northeastern University, Shenyang, China
Tong Zheng
University of Maryland, College Park; Northeastern University
Machine Translation · Language Modeling · Reasoning · Inference
Bei Li
Meituan LLM Team
Machine Translation · Deep Learning · Large Language Models
Peinan Feng
School of Computer Science and Engineering, Northeastern University, Shenyang, China
Quan Du
NiuTrans Research, Shenyang, China
Tong Xiao
School of Computer Science and Engineering, Northeastern University, Shenyang, China; NiuTrans Research, Shenyang, China
Jingbo Zhu
Northeastern University, China
Machine Translation · Language Parsing · Natural Language Processing