🤖 AI Summary
To address the long-standing marginalization of low-resource Southeast Asian (SEA) languages in large language model (LLM) research, this work introduces an open-source multilingual LLM family covering the major languages of the region. Methodologically, it builds on two foundation models, Llama-3 (8B) and Gemma (9B), and applies a multi-stage pipeline of continual pretraining, instruction fine-tuning, alignment, and model merging, yielding unified support for 11 languages: English, Chinese, and SEA languages including Indonesian, Vietnamese, and Thai. Contributions include: (1) open-source multilingual LLMs with comprehensive SEA language coverage; (2) a progressive post-training regime designed for low-resource language adaptation; and (3) state-of-the-art results among LLMs supporting SEA languages on multilingual benchmarks, with substantial gains in both linguistic understanding and generation for local languages.
📄 Abstract
Recently, Large Language Models (LLMs) have dominated much of the artificial intelligence scene with their ability to process and generate natural languages. However, the majority of LLM research and development remains English-centric, leaving low-resource languages such as those in the Southeast Asian (SEA) region under-represented. To address this representation gap, we introduce Llama-SEA-LION-v3-8B-IT and Gemma-SEA-LION-v3-9B-IT, two cutting-edge multilingual LLMs designed for SEA languages. The SEA-LION family of LLMs supports 11 languages used in the SEA region, namely English, Chinese, Indonesian, Vietnamese, Malay, Thai, Burmese, Lao, Filipino, Tamil, and Khmer. Our work leverages large-scale multilingual continued pre-training with a comprehensive post-training regime involving multiple stages of instruction fine-tuning, alignment, and model merging. Evaluation results on multilingual benchmarks indicate that our models achieve state-of-the-art performance among LLMs supporting SEA languages. We open-source the models to benefit the wider SEA community.
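The final stage of the post-training regime mentioned above is model merging, which combines parameters from several fine-tuned checkpoints into a single model. The abstract does not specify which merging scheme is used, so the sketch below illustrates only the simplest variant, linear weight averaging, on toy parameter dictionaries; `merge_checkpoints` and its inputs are illustrative names, not from the paper.

```python
def merge_checkpoints(state_dicts, weights):
    """Linearly combine parameters from several checkpoints.

    state_dicts: list of {param_name: list of floats} mappings,
                 all with identical keys and shapes.
    weights: per-checkpoint mixing coefficients (typically summing to 1.0).
    """
    assert len(state_dicts) == len(weights), "one weight per checkpoint"
    merged = {}
    for name in state_dicts[0]:
        n = len(state_dicts[0][name])
        # Weighted element-wise average across all checkpoints.
        merged[name] = [
            sum(w * sd[name][i] for sd, w in zip(state_dicts, weights))
            for i in range(n)
        ]
    return merged

# Toy example: two "checkpoints" sharing one 3-element parameter.
ckpt_a = {"layer.weight": [1.0, 2.0, 3.0]}
ckpt_b = {"layer.weight": [3.0, 4.0, 5.0]}
merged = merge_checkpoints([ckpt_a, ckpt_b], [0.5, 0.5])
# merged["layer.weight"] is [2.0, 3.0, 4.0]
```

In practice this operates on full tensor state dicts (e.g. via PyTorch or a tool such as mergekit) rather than Python lists, and production merges often use more elaborate schemes than a uniform average; the toy version above only conveys the core idea of combining checkpoint weights element-wise.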