Ensemble Self-Training for Unsupervised Machine Translation

📅 2026-03-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the performance bottleneck in unsupervised neural machine translation (UNMT) caused by the absence of high-quality parallel data. It proposes an ensemble self-training framework that leverages auxiliary languages to induce model diversity: multiple structurally diverse UNMT models are combined via token-level ensemble decoding to generate high-quality pseudo-translations for the primary language pair, and these pseudo-translations are then used as synthetic parallel data for self-training, enabling the models to share supervision signals. While maintaining the inference cost of a single model, the method achieves statistically significant improvements in translation quality, yielding average gains of 1.7 chrF in English-to-X directions and 0.67 chrF in X-to-English directions.
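The token-level ensemble decoding the summary describes can be illustrated with a minimal sketch: at each decoding step, the next-token probability distributions of the individual models are averaged and the combined distribution is decoded greedily. The models, vocabulary, and distributions below are hypothetical toy values, not from the paper.

```python
import numpy as np

def ensemble_greedy_step(model_probs):
    """Average per-step next-token distributions from several models and
    pick the argmax (a minimal sketch of token-level ensemble decoding)."""
    # model_probs: list of arrays, each of shape (vocab_size,)
    avg = np.mean(np.stack(model_probs), axis=0)
    return int(np.argmax(avg)), avg

# Toy vocabulary of size 4; three hypothetical UNMT models disagree
# on the next token, but the average resolves the disagreement.
p1 = np.array([0.1, 0.6, 0.2, 0.1])
p2 = np.array([0.1, 0.2, 0.6, 0.1])
p3 = np.array([0.1, 0.5, 0.3, 0.1])
token, avg = ensemble_greedy_step([p1, p2, p3])
# Two of three models favor token 1, so the averaged distribution does too.
```

In practice this averaging would run inside beam search over full model outputs; the single greedy step above only shows how the per-token combination works.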

📝 Abstract
We present an ensemble-driven self-training framework for unsupervised neural machine translation (UNMT). Starting from a primary language pair, we train multiple UNMT models that share the same translation task but differ in an auxiliary language, inducing structured diversity across models. We then generate pseudo-translations for the primary pair using token-level ensemble decoding, averaging model predictions in both directions. These ensemble outputs are used as synthetic parallel data to further train each model, allowing the models to improve via shared supervision. At deployment time, we select a single model by validation performance, preserving single-model inference cost. Experiments show statistically significant improvements over single-model UNMT baselines, with mean gains of 1.7 chrF when translating from English and 0.67 chrF when translating into English.
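The self-training loop in the abstract, pairing monolingual source text with ensemble pseudo-translations to build synthetic parallel data, can be sketched as follows. The toy lookup-table "models" and the majority-vote combiner are illustrative stand-ins (the paper averages token probabilities rather than voting on discrete tokens).

```python
from collections import Counter

def token_vote(outputs):
    """Combine per-position tokens from several model outputs by majority
    vote -- a crude discrete stand-in for probability averaging."""
    return [Counter(toks).most_common(1)[0][0] for toks in zip(*outputs)]

def build_pseudo_parallel(monolingual, models):
    """One self-training round: pair each monolingual source sentence with
    the ensemble translation, yielding synthetic parallel data on which
    each individual model could then be further trained."""
    pseudo = []
    for src in monolingual:
        hyps = [m(src) for m in models]          # each model translates
        pseudo.append((src, token_vote(hyps)))   # ensemble output as target
    return pseudo

# Toy word-for-word translators that disagree on individual words.
m1 = lambda s: [{"hello": "bonjour", "world": "monde"}.get(w, w) for w in s]
m2 = lambda s: [{"hello": "salut",   "world": "monde"}.get(w, w) for w in s]
m3 = lambda s: [{"hello": "bonjour", "world": "terre"}.get(w, w) for w in s]
data = build_pseudo_parallel([["hello", "world"]], [m1, m2, m3])
# Majority vote keeps "bonjour" and "monde" in the pseudo-translation.
```

Repeating this round, then selecting one model by validation score for deployment, mirrors the single-model inference cost the abstract emphasizes.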
Problem

Research questions and friction points this paper is trying to address.

unsupervised machine translation
neural machine translation
pseudo-translations
model diversity
synthetic parallel data
Innovation

Methods, ideas, or system contributions that make the work stand out.

ensemble self-training
unsupervised neural machine translation
structured diversity
token-level ensemble decoding
synthetic parallel data