Nile-Chat: Egyptian Language Models for Arabic and Latin Scripts

πŸ“… 2025-07-06
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
Addressing the challenge of modeling Egyptian Arabic, which is written in both Arabic and Latin scripts and remains low-resource, this paper applies the Branch-Train-MiX strategy to develop Nile-Chat, a family of LLMs that includes a Mixture-of-Experts (MoE) model handling both scripts within a single model. The method trains script-specific expert branches, merges them into one MoE model, and applies instruction tuning. Evaluated on a newly constructed bilingual benchmark for Egyptian Arabic spanning both understanding and generation, Nile-Chat-12B outperforms Qwen2.5-14B-Instruct by 14.4% on Latin-script tasks and consistently surpasses LLaMa, Jais, and ALLaM baselines. The work offers a reusable recipe for adapting LLMs to dual-script languages. All models, datasets, and code are publicly released.

πŸ“ Abstract
We introduce Nile-Chat-4B, 3x4B-A6B, and 12B, a collection of LLMs for Egyptian dialect, uniquely designed to understand and generate texts written in both Arabic and Latin scripts. Specifically, with Nile-Chat-3x4B-A6B, we introduce a novel language adaptation approach by leveraging the Branch-Train-MiX strategy to merge script-specialized experts into a single MoE model. Our Nile-Chat models significantly outperform leading multilingual and Arabic LLMs, such as LLaMa, Jais, and ALLaM, on our newly introduced Egyptian evaluation benchmarks, which span both understanding and generative tasks. Notably, our 12B model yields a 14.4% performance gain over Qwen2.5-14B-Instruct on Latin-script benchmarks. All our resources are publicly available. We believe this work presents a comprehensive methodology for adapting LLMs to dual-script languages, addressing an often overlooked aspect in modern LLM development.
Problem

Research questions and friction points this paper is trying to address.

Egyptian Arabic is written in both Arabic and Latin scripts, a dual-script setting largely overlooked in modern LLM development
Leading multilingual and Arabic LLMs (LLaMa, Jais, ALLaM) underperform on Egyptian-dialect tasks
Evaluation benchmarks covering both understanding and generation for Egyptian Arabic have been lacking
Innovation

Methods, ideas, or system contributions that make the work stand out.

Nile-Chat model family (4B, 3x4B-A6B, 12B) for Egyptian dialect with dual-script support
Branch-Train-MiX adaptation: script-specialized expert branches merged into a single MoE model
New Egyptian evaluation benchmarks spanning understanding and generative tasks
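The Branch-Train-MiX idea above, train separate dense branches, then reuse their feed-forward blocks as experts behind a token-level router, can be sketched in a few lines. This is a minimal toy illustration, not the Nile-Chat implementation: the dimensions, the ReLU feed-forward shape, the top-k routing, and all names (`DenseFFN`, `MoEFFN`, `arabic_branch`, `latin_branch`) are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

class DenseFFN:
    """Toy feed-forward block standing in for one script-specialized branch."""
    def __init__(self, d, h, seed):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0, 0.02, (d, h))
        self.w2 = rng.normal(0, 0.02, (h, d))

    def __call__(self, x):
        return np.maximum(x @ self.w1, 0) @ self.w2  # ReLU FFN

class MoEFFN:
    """Branch-Train-MiX-style layer: the experts are the FFNs taken from the
    already-trained branches; a freshly initialized router mixes them per token
    (the router would then be trained jointly on mixed-script data)."""
    def __init__(self, branches, d, seed=0):
        self.experts = branches
        rng = np.random.default_rng(seed)
        self.router = rng.normal(0, 0.02, (d, len(branches)))

    def __call__(self, x, top_k=2):
        gates = np.asarray(softmax(x @ self.router))  # (tokens, n_experts)
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            top = np.argsort(gates[t])[-top_k:]        # keep top-k experts
            w = gates[t, top] / gates[t, top].sum()    # renormalize gate weights
            for j, g in zip(top, w):
                out[t] += g * self.experts[j](x[t:t + 1])[0]
        return out

# Merge two hypothetical script-specialized branches into one MoE layer.
d, h = 16, 32
arabic_branch = DenseFFN(d, h, seed=1)
latin_branch = DenseFFN(d, h, seed=2)
moe = MoEFFN([arabic_branch, latin_branch], d)
tokens = np.random.default_rng(3).normal(size=(4, d))
y = moe(tokens)  # (4, 16): one mixed-expert output per token
```

The key design point the sketch captures is that no expert is trained from scratch: the dense branches keep their learned weights, and only the lightweight router decides, per token, how much each script specialist contributes.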