In-Domain African Languages Translation Using LLMs and Multi-armed Bandits

📅 2025-05-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
Domain adaptation for low-resource African languages faces severe constraints: scarce or nonexistent in-domain labeled data, and computational costs that make fine-tuning neural machine translation (NMT) models prohibitive. Method: This paper proposes an online model selection framework based on multi-armed bandits (MAB), systematically adapting UCB, Linear UCB, Neural Linear Bandit, and Thompson Sampling to zero-shot and few-shot NMT domain adaptation, enabling training-free, dynamic selection of the best model per input. Contribution/Results: Evaluated on three African languages across domain-specific test sets, the method consistently outperforms random and static model selection, both when in-domain data is available and when it is entirely absent, yielding BLEU gains of 2.4–4.1 points. It offers a scalable approach to unsupervised domain adaptation for low-resource NMT that requires neither labeled data nor model retraining.
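The bandit formulation above treats each candidate NMT model as an arm and a per-sentence translation quality score as the reward. A minimal sketch using the standard UCB1 rule illustrates the idea; note that the number of models, their quality levels, and the Gaussian reward proxy here are illustrative assumptions, not the paper's actual setup:

```python
import math
import random

def ucb1_select(counts, rewards, t, c=1.0):
    """Pick the arm (candidate model) maximizing the UCB1 score:
    empirical mean reward plus an exploration bonus."""
    for i, n in enumerate(counts):
        if n == 0:  # play each arm once before applying the bound
            return i
    return max(
        range(len(counts)),
        key=lambda i: rewards[i] / counts[i]
        + c * math.sqrt(2 * math.log(t) / counts[i]),
    )

# Simulated online selection among three hypothetical NMT models whose
# (hidden) mean translation quality differs; rewards are noisy proxies
# for a per-sentence quality metric.
random.seed(0)
true_quality = [0.30, 0.70, 0.45]  # assumed mean quality per model
counts = [0, 0, 0]
rewards = [0.0, 0.0, 0.0]
for t in range(1, 501):
    arm = ucb1_select(counts, rewards, t)
    reward = random.gauss(true_quality[arm], 0.05)  # noisy quality score
    counts[arm] += 1
    rewards[arm] += reward

# Over time, pulls concentrate on the model with the highest mean quality.
most_selected = max(range(3), key=lambda i: counts[i])
```

In the paper's setting the reward would come from a reference-based metric (when in-domain data is available) or a reference-free quality estimate (when it is not); the contextual variants (Linear UCB, Neural Linear Bandit) additionally condition the choice on features of the input sentence.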

📝 Abstract
Neural Machine Translation (NMT) systems face significant challenges when working with low-resource languages, particularly in domain adaptation tasks. These difficulties arise from limited training data and suboptimal model generalization. As a result, selecting an optimal model for translation is crucial for achieving strong performance on in-domain data, particularly in scenarios where fine-tuning is not feasible or practical. In this paper, we investigate strategies for selecting the most suitable NMT model for a given domain using bandit-based algorithms, including Upper Confidence Bound, Linear UCB, Neural Linear Bandit, and Thompson Sampling. Our method effectively addresses resource constraints by facilitating optimal model selection with high confidence. We evaluate the approach across three African languages and domains, demonstrating its robustness and effectiveness both when target-domain data is available and when it is absent.
Problem

Research questions and friction points this paper is trying to address.

Addressing low-resource African language translation challenges
Optimizing NMT model selection using bandit algorithms
Improving domain adaptation when fine-tuning is not feasible
Innovation

Methods, ideas, or system contributions that make the work stand out.

Using bandit algorithms for optimal NMT model selection
Addressing low-resource languages with multi-armed bandits
Evaluating across African languages with limited data