🤖 AI Summary
Addressing the challenges of modeling Egyptian Arabic's multigraphic nature (it is written in both Arabic and Latin scripts) and its low-resource setting, this paper applies the Branch-Train-MiX strategy to develop Nile-Chat, the first Mixture-of-Experts (MoE) large language model offering a unified representation across both scripts. The method employs script-specific expert branches, cross-script knowledge fusion, and fine-grained instruction tuning. Evaluated on a newly constructed bilingual benchmark for Egyptian Arabic, Nile-Chat achieves consistent improvements in both understanding and generation. Experiments show that Nile-Chat-12B outperforms Qwen2.5-14B-Instruct by 14.4% on Latin-script tasks and consistently surpasses the LLaMA, Jais, and ALLaM baselines. Crucially, it establishes the first scalable, reusable modeling paradigm for multigraphic Arabic dialects. All models, datasets, and code are publicly released.
📄 Abstract
We introduce Nile-Chat-4B, 3x4B-A6B, and 12B, a collection of LLMs for the Egyptian dialect, uniquely designed to understand and generate text written in both Arabic and Latin scripts. Specifically, with Nile-Chat-3x4B-A6B, we introduce a novel language adaptation approach by leveraging the Branch-Train-MiX strategy to merge script-specialized experts into a single MoE model. Our Nile-Chat models significantly outperform leading multilingual and Arabic LLMs, such as LLaMA, Jais, and ALLaM, on our newly introduced Egyptian evaluation benchmarks, which span both understanding and generative tasks. Notably, our 12B model yields a 14.4% performance gain over Qwen2.5-14B-Instruct on Latin-script benchmarks. All our resources are publicly available. We believe this work presents a comprehensive methodology for adapting LLMs to dual-script languages, addressing an often overlooked aspect in modern LLM development.
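To make the Branch-Train-MiX merging step concrete, the following is a minimal NumPy sketch of how separately trained, script-specialized feed-forward branches can be combined into one MoE layer with a freshly initialized router. All names, the top-1 routing rule, and the tiny dimensions are illustrative assumptions for exposition, not the paper's exact recipe.

```python
import numpy as np

rng = np.random.default_rng(0)
D, H, E = 8, 16, 2  # hidden dim, FFN dim, number of experts

# Two script-specialized FFN branches (hypothetical stand-ins for the
# Arabic-script and Latin-script experts trained on separate branches).
experts = [
    {"w1": rng.standard_normal((D, H)), "w2": rng.standard_normal((H, D))}
    for _ in range(E)
]
# Router (gating) weights are new parameters added at merge time.
router = rng.standard_normal((D, E))

def moe_ffn(x):
    """Top-1 MoE forward pass: each token is routed to one expert."""
    logits = x @ router                          # (tokens, E)
    probs = np.exp(logits - logits.max(-1, keepdims=True))
    probs /= probs.sum(-1, keepdims=True)        # softmax gate
    choice = probs.argmax(-1)                    # hard top-1 routing
    out = np.zeros_like(x)
    for e, p in enumerate(experts):
        mask = choice == e
        h = np.maximum(x[mask] @ p["w1"], 0.0)   # ReLU FFN
        out[mask] = (h @ p["w2"]) * probs[mask, e:e + 1]
    return out

tokens = rng.standard_normal((5, D))
print(moe_ffn(tokens).shape)  # (5, 8)
```

After merging, the router (and typically the whole model) is fine-tuned so the gate learns to dispatch Arabic-script and Latin-script tokens to the appropriate expert.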