Unveiling Language Routing Isolation in Multilingual MoE Models for Interpretable Subnetwork Adaptation

📅 2026-04-04
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the significant performance disparity between high- and low-resource languages in multilingual Mixture-of-Experts (MoE) models, a gap exacerbated by the lack of interpretability in their routing mechanisms. The study uncovers, for the first time, a phenomenon termed Language Routing Isolation, in which the experts activated by high- and low-resource languages are largely disjoint. Building on this insight, the authors propose RISE, a framework that leverages layer-wise routing analysis, language-specificity scoring, and expert overlap quantification to identify and fine-tune language-specific subnetworks. Experiments across ten languages demonstrate that RISE improves F1 scores for target low-resource languages by up to 10.85% while preserving or minimally affecting performance on other languages, thereby enabling efficient and interpretable cross-lingual adaptation.
๐Ÿ“ Abstract
Mixture-of-Experts (MoE) models exhibit striking performance disparities across languages, yet the internal mechanisms driving these gaps remain poorly understood. In this work, we conduct a systematic analysis of expert routing patterns in MoE models, revealing a phenomenon we term Language Routing Isolation, in which high- and low-resource languages tend to activate largely disjoint expert sets. Through layer-stratified analysis, we further show that routing patterns exhibit a layer-wise convergence-divergence pattern across model depth. Building on these findings, we propose RISE (Routing Isolation-guided Subnetwork Enhancement), a framework that exploits routing isolation to identify and adapt language-specific expert subnetworks. RISE applies a tripartite selection strategy, using specificity scores to identify language-specific experts in shallow and deep layers and overlap scores to select universal experts in middle layers. By training only the selected subnetwork while freezing all other parameters, RISE substantially improves low-resource language performance while preserving capabilities in other languages. Experiments on 10 languages demonstrate that RISE achieves target-language F1 gains of up to 10.85% with minimal cross-lingual degradation.
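The abstract's tripartite selection strategy can be sketched in code. The scoring rules below are illustrative assumptions, not RISE's actual definitions (this listing does not give the paper's formulas): specificity is taken as the target language's share of an expert's routing activations, overlap as Jaccard similarity between expert sets, and layer depth decides whether a language-specific or a "universal" expert is kept for fine-tuning while everything else stays frozen.

```python
def specificity(counts, target):
    """Illustrative language-specificity score for one expert:
    the target language's share of that expert's routing activations."""
    total = sum(counts.values())
    return counts.get(target, 0) / total if total else 0.0

def jaccard(experts_a, experts_b):
    """Illustrative expert-overlap score between two languages'
    activated expert sets at a given layer."""
    a, b = set(experts_a), set(experts_b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

def select_subnetwork(activations, target_lang, n_layers,
                      edge_depth=1, spec_thresh=0.6, share_thresh=0.2):
    """Tripartite selection over (layer, expert) pairs.

    activations: {layer: {expert: {lang: activation_count}}}
    Shallow/deep ("edge") layers keep experts dominated by the target
    language; middle layers keep universal experts whose activations are
    shared between the target and other languages. Only the returned
    pairs would be trained; all other parameters stay frozen.
    """
    selected = []
    for layer, experts in activations.items():
        is_edge = layer < edge_depth or layer >= n_layers - edge_depth
        for expert, counts in experts.items():
            s = specificity(counts, target_lang)
            if is_edge and s >= spec_thresh:
                selected.append((layer, expert))          # language-specific
            elif not is_edge and share_thresh <= s <= 1 - share_thresh:
                selected.append((layer, expert))          # universal
    return selected

# Toy routing statistics for a 3-layer MoE, target language "sw":
acts = {
    0: {0: {"sw": 9, "en": 1}, 1: {"sw": 1, "en": 9}},   # shallow layer
    1: {0: {"sw": 5, "en": 5}, 1: {"sw": 0, "en": 10}},  # middle layer
    2: {0: {"sw": 8, "en": 2}, 1: {"sw": 2, "en": 8}},   # deep layer
}
print(select_subnetwork(acts, "sw", n_layers=3))  # → [(0, 0), (1, 0), (2, 0)]
```

In this toy run, expert 0 is selected at the shallow and deep layers because Swahili dominates its activations, while expert 0 at the middle layer is kept as a universal expert shared with English, matching the shallow/deep-specific, middle-universal pattern described in the abstract.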
Problem

Research questions and friction points this paper is trying to address.

Mixture-of-Experts
Language Routing Isolation
multilingual models
expert routing
low-resource languages
Innovation

Methods, ideas, or system contributions that make the work stand out.

Language Routing Isolation
Mixture-of-Experts
Subnetwork Adaptation
RISE
Multilingual Models