Babel: Open Multilingual Large Language Models Serving Over 90% of Global Speakers

📅 2025-03-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing open-source multilingual large language models (LLMs) exhibit strong bias toward high-resource languages, severely neglecting widely spoken yet data-scarce languages—leaving over one billion people globally without adequate high-quality language support. To address this gap, we introduce the Babel series of open-source multilingual LLMs, covering the world’s top 25 languages (serving >90% of the global population). We propose a novel “layer-expansion parameter growth” mechanism—replacing conventional continued pretraining—to enable efficient, scalable architecture evolution. The series comprises two model sizes: 9B and 83B parameters, trained via multi-stage, multilingual mixed pretraining and open-source supervised fine-tuning (SFT). Babel-9B-Chat achieves state-of-the-art performance among ~10B-parameter models, while Babel-83B-Chat matches commercial multilingual LLMs in capability and substantially outperforms same-scale open-source baselines—particularly for low-resource languages, thereby closing a critical modeling gap.

📝 Abstract
Large language models (LLMs) have revolutionized natural language processing (NLP), yet open-source multilingual LLMs remain scarce, with existing models often limited in language coverage. Such models typically prioritize well-resourced languages, while widely spoken but under-resourced languages are often overlooked. To address this disparity, we introduce Babel, an open multilingual LLM that covers the top 25 languages by number of speakers, supports over 90% of the global population, and includes many languages neglected by other open multilingual LLMs. Unlike traditional continued-pretraining approaches, Babel expands its parameter count through a layer extension technique that elevates Babel's performance ceiling. We introduce two variants: Babel-9B, designed for efficient inference and fine-tuning, and Babel-83B, which sets a new standard for open multilingual LLMs. Extensive evaluations on multilingual tasks demonstrate its superior performance compared to open LLMs of comparable size. In addition, using open-source supervised fine-tuning datasets, Babel achieves remarkable performance, with Babel-9B-Chat leading among 10B-sized LLMs and Babel-83B-Chat setting a new standard for multilingual tasks, reaching the same level as commercial models.
Problem

Research questions and friction points this paper is trying to address.

Addresses scarcity of open-source multilingual large language models.
Expands language coverage to include under-resourced languages.
Introduces scalable models for efficient inference and fine-tuning.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Open multilingual LLM covering 25 languages
Layer extension technique increases parameter count
Two variants: Babel-9B and Babel-83B
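The exact layer-extension recipe is not detailed in this summary; a common way to grow parameter count by depth is to insert copies of existing transformer layers, so the extended model starts from the base model's weights before further training. A minimal sketch of that idea, using plain Python dicts as stand-in layer weights (the function name and `insert_every` parameter are illustrative, not from the paper):

```python
import copy

def extend_layers(layers, insert_every=4):
    """Grow a model's depth by duplicating existing layers.

    After every `insert_every` original layers, a copy of the most
    recent layer is inserted. Copies start from the original weights,
    so the deepened model can be trained further from a warm start.
    """
    extended = []
    for i, layer in enumerate(layers, start=1):
        extended.append(layer)
        if i % insert_every == 0:
            extended.append(copy.deepcopy(layer))  # duplicated layer
    return extended

# Toy "model": each layer is just a dict of weights.
base = [{"id": i, "w": [0.1 * i]} for i in range(8)]
grown = extend_layers(base, insert_every=4)
print(len(base), len(grown))  # 8 original layers grow to 10
```

Duplicating layers (rather than random initialization) keeps the extended model close to the base model's behavior, which is why the abstract frames layer extension as raising the performance ceiling relative to continued pretraining alone.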