JGU Mainz's Submission to the WMT25 Shared Task on LLMs with Limited Resources for Slavic Languages: MT and QA

📅 2025-09-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses machine translation (MT) and question answering (QA) for three low-resource Slavic languages: Ukrainian, Upper Sorbian, and Lower Sorbian. To cope with data scarcity, the submission fine-tunes Qwen2.5-3B-Instruct jointly on MT and multiple-choice QA using Parameter-Efficient Fine-Tuning (PEFT), drawing on additional translation corpora and QA benchmarks. Contextual grounding for Ukrainian QA is strengthened via Retrieval-Augmented Generation (RAG), and QA for Upper and Lower Sorbian is improved through model ensembling. Experiments show that the resulting models outperform the baseline on both tasks, yielding a scalable, resource-conscious recipe for NLP in under-resourced Slavic languages.
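The summary names PEFT but not the adapter configuration used in the submission. As a purely illustrative sketch (ranks, shapes, and initialization are assumptions, not the paper's settings), a LoRA-style parameter-efficient update replaces full weight updates with a trainable low-rank product added to a frozen weight matrix:

```python
import numpy as np

# Illustrative LoRA-style update: instead of training the full weight W
# (d_out x d_in), train two small matrices B (d_out x r) and A (r x d_in)
# with rank r << min(d_out, d_in). W stays frozen.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 64, 64, 8, 16

W = rng.standard_normal((d_out, d_in))   # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01
B = np.zeros((d_out, r))                 # B starts at zero, so the update starts at zero

def lora_forward(x, W, A, B, alpha, r):
    """Forward pass with the scaled low-rank update applied on the fly."""
    return x @ (W + (alpha / r) * (B @ A)).T

# Trainable parameters drop from d_out*d_in to r*(d_out + d_in).
full_params = d_out * d_in
lora_params = r * (d_out + d_in)
print(full_params, lora_params)  # → 4096 1024
```

Only `A` and `B` receive gradients during fine-tuning; at inference the update can be merged into `W` so the adapted model runs at the same cost as the original.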

📝 Abstract
This paper presents the JGU Mainz submission to the WMT25 Shared Task on LLMs with Limited Resources for Slavic Languages: Machine Translation and Question Answering, focusing on Ukrainian, Upper Sorbian, and Lower Sorbian. For each language, we jointly fine-tune a Qwen2.5-3B-Instruct model for both tasks with parameter-efficient finetuning. Our pipeline integrates additional translation and multiple-choice question answering (QA) data. For Ukrainian QA, we further use retrieval-augmented generation. We also apply ensembling for QA in Upper and Lower Sorbian. Experiments show that our models outperform the baseline on both tasks.
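The abstract does not specify the retriever behind the Ukrainian QA pipeline. As a hypothetical minimal sketch of the RAG pattern (the passages, tokenization, and bag-of-words scoring below are illustrative assumptions, not the paper's components), a question is paired with its best-matching passage before prompting the model:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, passages: list[str], k: int = 1) -> list[str]:
    """Return the k passages most similar to the question."""
    q = Counter(question.lower().split())
    ranked = sorted(passages,
                    key=lambda p: cosine(q, Counter(p.lower().split())),
                    reverse=True)
    return ranked[:k]

passages = [
    "Kyiv is the capital of Ukraine.",
    "Upper Sorbian is spoken in Saxony.",
]
question = "What is the capital of Ukraine?"
context = retrieve(question, passages)[0]
# The retrieved passage is prepended to the QA prompt fed to the LLM.
prompt = f"Context: {context}\nQuestion: {question}"
```

In practice a dense retriever or TF-IDF index over a larger corpus would replace the toy similarity here; the prompt-assembly step is the same.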
Problem

Research questions and friction points this paper is trying to address.

Developing machine translation for Slavic languages with limited resources
Creating question answering systems for Ukrainian and Sorbian languages
Optimizing models for both translation and QA using efficient finetuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parameter-efficient fine-tuning of the Qwen2.5-3B-Instruct model
Integration of translation and QA data
Retrieval-augmented generation and ensembling techniques
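The ensembling strategy for Upper and Lower Sorbian QA is not detailed here. One common choice for multiple-choice QA, shown as an assumption-level sketch (the checkpoint votes are invented), is a simple majority vote over the answer letters predicted by several fine-tuned checkpoints:

```python
from collections import Counter

def ensemble_answer(predictions: list[str]) -> str:
    """Majority vote over per-model answer letters.

    Ties are broken by whichever answer appeared first, since Counter
    preserves insertion order (Python 3.7+).
    """
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical answer letters from three fine-tuned checkpoints:
votes = ["B", "A", "B"]
print(ensemble_answer(votes))  # → B
```

Voting over discrete answer letters is robust to calibration differences between checkpoints; averaging per-option logits is a common alternative when model scores are available.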