Typologically Informed Parameter Aggregation

📅 2026-01-23
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the limited performance of multilingual large language models on low-resource and unseen languages, as well as the high cost of training language-specific adapters. The authors propose a zero-shot cross-lingual transfer method that requires no additional training: they are the first to incorporate linguistic typological similarity into the MAD-X framework, constructing a proxy adapter for the target language as a weighted aggregation of existing language adapters. Evaluated across five NLP tasks and over 230 languages, the approach consistently matches or significantly outperforms current baselines, with particularly notable gains for languages lacking dedicated adapters. The study thus establishes an efficient, training-free paradigm for cross-lingual transfer and offers a practical solution for low-resource language scenarios.

πŸ“ Abstract
Massively multilingual language models enable cross-lingual generalization but underperform on low-resource and unseen languages. While adapter-based fine-tuning offers a parameter-efficient solution, training language-specific adapters at scale remains costly. We introduce Typologically Informed Parameter Aggregation (TIPA), a training-free method that constructs proxy language adapters by aggregating existing ones, weighted by typological similarity. Integrated into the MAD-X framework, these proxies enable zero-shot cross-lingual transfer without additional training. We evaluate TIPA on five NLP tasks and over 230 languages. TIPA consistently outperforms or matches baselines such as English-only fine-tuning or selecting the typologically closest language adapter. We see the largest gains for languages lacking dedicated adapters. Our results demonstrate that typologically informed aggregation provides a viable alternative to language-specific modules without any training needed.
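The core operation described in the abstract, building a proxy adapter as a similarity-weighted combination of existing language adapters, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the softmax weighting, and the temperature parameter are assumptions; the paper does not specify here how typological similarities are turned into weights.

```python
import numpy as np

def aggregate_adapters(adapters, similarities, temperature=1.0):
    """Build a proxy adapter as a typology-weighted average of existing ones.

    adapters:     dict mapping language code -> {param_name: np.ndarray}
    similarities: dict mapping language code -> typological similarity
                  to the target language (higher = more similar)
    temperature:  softens/sharpens the weighting (illustrative assumption)
    """
    langs = list(adapters)
    sims = np.array([similarities[lang] for lang in langs]) / temperature
    # Softmax over similarities, shifted by the max for numerical stability.
    weights = np.exp(sims - sims.max())
    weights /= weights.sum()
    # Weighted average of each parameter tensor across source adapters.
    proxy = {}
    for name in adapters[langs[0]]:
        proxy[name] = sum(w * adapters[lang][name]
                          for w, lang in zip(weights, langs))
    return proxy
```

In a MAD-X-style setup, the resulting `proxy` would be loaded as the language adapter for the unseen target language, with no gradient updates required.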
Problem

Research questions and friction points this paper is trying to address.

multilingual language models
low-resource languages
cross-lingual transfer
adapter-based fine-tuning
zero-shot learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Typologically Informed Parameter Aggregation
adapter-based fine-tuning
zero-shot cross-lingual transfer
multilingual language models
language typology
🔎 Similar Papers
No similar papers found.
Stef Accou
LAGOM·NLP, Department of Computer Science, KU Leuven; Department of Linguistics, KU Leuven
Wessel Poelman
PhD candidate at KU Leuven
Natural Language Processing · Computational Linguistics · Multilingual NLP