Representation Collapse in Machine Translation Through the Lens of Angular Dispersion

📅 2026-02-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses representation collapse in deep Transformers for neural machine translation, a phenomenon in which hidden representations fail to use the available geometric space efficiently; it is particularly pronounced in end-to-end trained continuous-output models and in quantized models. The study presents the first systematic analysis of how representation collapse evolves across Transformer layers during training. To mitigate the problem, the authors apply an existing regularization method based on angular dispersion. Experiments show that the approach improves translation quality, yielding consistent gains in both standard and quantized models while preserving robustness and computational efficiency.
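As a rough illustration of how per-layer collapse could be tracked, the sketch below computes a simple angular-dispersion proxy: the complement of the mean resultant length from directional statistics. The function name, the choice of proxy, and the usage pattern are assumptions for illustration, not the paper's exact metric.

```python
import torch
import torch.nn.functional as F

def angular_dispersion(h: torch.Tensor) -> torch.Tensor:
    """Angular-dispersion proxy for a batch of hidden states.

    h: (n, d) tensor of representations taken from one Transformer layer.
    Returns a scalar in [0, 1]: values near 0 mean the vectors all point
    in nearly the same direction (collapse); values near 1 mean they are
    spread out over the unit sphere.
    """
    u = F.normalize(h, dim=-1)        # project each vector to the unit sphere
    r = u.mean(dim=0).norm()          # mean resultant length (directional statistics)
    return 1.0 - r                    # complement: high = dispersed, low = collapsed

# Hypothetical usage: collect per-layer hidden states (e.g. via forward
# hooks), flatten batch and sequence dims, and log dispersion per layer.
h = torch.randn(512, 768)             # stand-in for one layer's hidden states
print(float(angular_dispersion(h)))   # close to 1.0 for random high-dim vectors
```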

📝 Abstract
Modern neural translation models based on the Transformer architecture are known for their high performance, particularly when trained on high-resource datasets. The standard next-token prediction training strategy, while widely adopted in practice, may lead to overlooked artifacts such as representation collapse. Previous work has shown that this problem is especially pronounced in the representations of the deeper Transformer layers, which often fail to utilize the geometric space efficiently. Representation collapse is even more evident in end-to-end training of continuous-output neural machine translation, where the trivial solution is to set all output vectors to the same value. In this work, we analyze the dynamics of representation collapse at different layers of discrete and continuous NMT Transformers throughout training. We incorporate an existing regularization method based on angular dispersion and demonstrate empirically that it not only mitigates collapse but also improves translation quality. Furthermore, we show that quantized models exhibit similar collapse behavior and that the benefits of regularization are preserved even after quantization.
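To make the regularization idea concrete, here is a minimal sketch of an angular-dispersion-style penalty in PyTorch, assuming it is implemented as the mean pairwise cosine similarity of normalized hidden states added to the usual NMT objective. The names `dispersion_penalty` and `training_loss`, the weight `lam`, and the combination with cross-entropy are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def dispersion_penalty(h: torch.Tensor) -> torch.Tensor:
    """Penalty that is high when hidden states are angularly collapsed.

    h: (n, d) tensor of hidden states (sub-sample n for large batches).
    Returns the mean off-diagonal pairwise cosine similarity; minimizing
    it pushes vectors apart on the unit sphere.
    """
    u = F.normalize(h, dim=-1)                       # unit-norm vectors
    cos = u @ u.t()                                  # (n, n) cosine similarities
    mask = ~torch.eye(cos.size(0), dtype=torch.bool, device=cos.device)
    return cos[mask].mean()                          # exclude self-similarity

def training_loss(logits, targets, layer_h, lam=0.1):
    # Standard NMT cross-entropy plus the dispersion term; `lam` trades
    # translation loss against angular spread of the representations.
    ce = F.cross_entropy(logits.view(-1, logits.size(-1)), targets.view(-1))
    return ce + lam * dispersion_penalty(layer_h)
```

In practice such a term would presumably be applied to the layers where collapse is observed; the paper's actual regularizer and its weighting may differ from this sketch.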
Problem

Research questions and friction points this paper is trying to address.

representation collapse
neural machine translation
Transformer
angular dispersion
quantization
Innovation

Methods, ideas, or system contributions that make the work stand out.

representation collapse
angular dispersion
neural machine translation
quantization
regularization