DIETA: A Decoder-only Transformer-based Model for Italian-English Machine TrAnslation

📅 2026-01-25
🏛️ Italian Conference on Computational Linguistics
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes DIETA, a 0.5B-parameter decoder-only Transformer model for low-resource Italian–English machine translation. By curating 207 million authentic parallel sentence pairs, combining them with 352 million back-translated examples, and introducing a new evaluation benchmark built from 2025 WikiNews articles, the study achieves substantial gains in translation quality. It presents the first dedicated small-scale decoder-only architecture for this language pair and publicly releases the curated parallel corpora, a contemporary news test set, and the training code. The model performs strongly across multiple benchmarks, consistently ranking in the second quartile among 32 systems and outperforming the majority of models under 3B parameters on four out of five test sets.

📝 Abstract
In this paper, we present DIETA, a small, decoder-only Transformer model with 0.5 billion parameters, specifically designed and trained for Italian-English machine translation. We collect and curate a large parallel corpus of approximately 207 million Italian-English sentence pairs across diverse domains, including parliamentary proceedings, legal texts, web-crawled content, subtitles, news, and literature, together with 352 million back-translated sentence pairs generated using pretrained models. Additionally, we create and release a new small-scale evaluation set of 450 sentences based on 2025 WikiNews articles, enabling assessment of translation quality on contemporary text. Comprehensive evaluations show that DIETA achieves competitive performance on multiple Italian-English benchmarks, consistently ranking in the second quartile of a 32-system leaderboard and outperforming most other sub-3B models on four out of five test suites. The training script, trained models, curated corpus, and newly introduced evaluation set are made publicly available, facilitating further research and development in specialized Italian-English machine translation. https://github.com/pkasela/DIETA-Machine-Translation
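The back-translation step described in the abstract can be sketched as follows. This is a minimal illustration of the general technique, not DIETA's actual pipeline: the `en_to_it` translator here is a toy placeholder standing in for a pretrained English-to-Italian model.

```python
# Back-translation sketch: monolingual English target sentences are
# paired with machine-generated Italian sources, yielding synthetic
# (source, target) training pairs where the target side is authentic.

def backtranslate(monolingual_en, en_to_it):
    """Build synthetic (italian, english) pairs from English-only text."""
    pairs = []
    for en_sentence in monolingual_en:
        synthetic_it = en_to_it(en_sentence)   # machine-generated source
        pairs.append((synthetic_it, en_sentence))  # authentic target
    return pairs

# Toy stand-in translator (hypothetical; a real system would use a
# pretrained reverse-direction MT model here).
toy_en_to_it = lambda s: {"Good morning.": "Buongiorno."}.get(s, "<it> " + s)

pairs = backtranslate(["Good morning."], toy_en_to_it)
print(pairs)  # [('Buongiorno.', 'Good morning.')]
```

The key property is that the fluency-critical target side remains human-written, which is why synthetic data at this scale (352M pairs here) can still improve translation quality.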
Problem

Research questions and friction points this paper is trying to address.

Italian-English machine translation
decoder-only Transformer
parallel corpus
evaluation benchmark
low-resource MT
Innovation

Methods, ideas, or system contributions that make the work stand out.

decoder-only Transformer
Italian-English machine translation
curated parallel corpus
back-translation
contemporary evaluation benchmark
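To illustrate the "decoder-only Transformer" contribution listed above: unlike encoder-decoder MT models, a decoder-only model frames translation as plain next-token prediction over one concatenated sequence. The tag format below is a hypothetical convention for illustration, not necessarily the one DIETA uses.

```python
# Sketch: casting translation as language modeling for a decoder-only
# model. Source and target are concatenated into a single sequence;
# the training loss is typically applied only to the target tokens.

def build_training_example(src_it: str, tgt_en: str) -> str:
    # Hypothetical language tags marking source and target spans.
    return f"<it> {src_it} <en> {tgt_en}"

example = build_training_example("Il gatto dorme.", "The cat sleeps.")
print(example)  # <it> Il gatto dorme. <en> The cat sleeps.
```

At inference time, the model is prompted with `"<it> {source} <en>"` and generates the English continuation token by token.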