Mitigating Semantic Leakage in Cross-lingual Embeddings via Orthogonality Constraint

📅 2024-09-24
🏛️ Workshop on Representation Learning for NLP
🤖 AI Summary
In cross-lingual sentence embedding, incomplete disentanglement of semantic and language representations leads to semantic leakage—i.e., language-specific information contaminating semantic representations, thereby impairing semantic transferability. This work formally defines and systematically addresses this issue for the first time, proposing ORACLE, an orthogonal constraint learning framework. ORACLE achieves fine-grained disentanglement atop multilingual BERT via end-to-end joint optimization of orthogonality between semantic and language subspaces, augmented by intra-class clustering and inter-class separation losses. Unlike conventional adversarial or projection-based methods, ORACLE avoids gradient instability and suboptimal solutions. Experiments demonstrate significant improvements on cross-lingual retrieval and semantic textual similarity tasks: semantic leakage is reduced by 32%, and cross-lingual semantic alignment improves by 19% on average.
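The orthogonality constraint at the core of ORACLE can be sketched as a differentiable penalty on the overlap between the semantic and language embeddings of each sentence. This is a minimal illustration, not the paper's implementation; the function name and the use of squared cosine similarity are assumptions.

```python
import torch
import torch.nn.functional as F

def orthogonality_loss(sem: torch.Tensor, lang: torch.Tensor) -> torch.Tensor:
    """Penalize overlap between semantic and language subspaces.

    sem, lang: (batch, dim) embeddings from the semantic and language heads.
    Returns the mean squared cosine similarity per example, which is 0 when
    each semantic vector is orthogonal to its paired language vector and 1
    when they are perfectly aligned.
    """
    cos = F.cosine_similarity(sem, lang, dim=-1)  # (batch,)
    return (cos ** 2).mean()
```

Because the penalty is computed directly from the two embedding heads, it can be optimized end-to-end jointly with the task loss, avoiding the min-max training that makes adversarial disentanglement unstable.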

📝 Abstract
Accurately aligning contextual representations in cross-lingual sentence embeddings is key for effective parallel data mining. A common strategy for achieving this alignment involves disentangling semantics and language in sentence embeddings derived from multilingual pre-trained models. However, we discover that current disentangled representation learning methods suffer from semantic leakage—a term we introduce to describe when a substantial amount of language-specific information is unintentionally leaked into semantic representations. This hinders the effective disentanglement of semantic and language representations, making it difficult to retrieve embeddings that distinctively represent the meaning of the sentence. To address this challenge, we propose a novel training objective, ORthogonAlity Constraint LEarning (ORACLE), tailored to enforce orthogonality between semantic and language embeddings. ORACLE builds upon two components: intra-class clustering and inter-class separation. Through experiments on cross-lingual retrieval and semantic textual similarity tasks, we demonstrate that training with the ORACLE objective effectively reduces semantic leakage and enhances semantic alignment within the embedding space.
Problem

Research questions and friction points this paper is trying to address.

Mitigating semantic leakage in cross-lingual embedding representations
Enhancing semantic alignment for cross-lingual parallel data mining
Separating language-specific information from semantic representations effectively
Innovation

Methods, ideas, or system contributions that make the work stand out.

Orthogonality constraint for semantic leakage reduction
Intra-class clustering and inter-class separation
Novel training objective ORACLE for embeddings
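The two ORACLE components listed above can be sketched as a contrastive loss over a batch of translation pairs: intra-class clustering pulls semantic embeddings of parallel sentences together, while inter-class separation pushes embeddings of different sentences apart. This is an illustrative sketch under assumed definitions (the function name, margin value, and exact loss form are not from the paper).

```python
import torch
import torch.nn.functional as F

def cluster_separate_loss(sem_a: torch.Tensor, sem_b: torch.Tensor,
                          margin: float = 0.5) -> torch.Tensor:
    """Sketch of intra-class clustering + inter-class separation.

    sem_a, sem_b: (batch, dim) semantic embeddings of parallel sentences;
    row i of sem_a and row i of sem_b share the same meaning.
    """
    a = F.normalize(sem_a, dim=-1)
    b = F.normalize(sem_b, dim=-1)
    sim = a @ b.T                          # (batch, batch) cosine similarities
    pos = sim.diagonal()                   # same-meaning (translation) pairs
    # Intra-class clustering: pull translation pairs toward similarity 1.
    intra = (1.0 - pos).mean()
    # Inter-class separation: push different-meaning pairs below the margin.
    off_diag = ~torch.eye(sim.size(0), dtype=torch.bool)
    inter = F.relu(sim[off_diag] - margin).mean()
    return intra + inter
```

Both terms operate only in the semantic subspace; combined with the orthogonality penalty on the language subspace, they encourage embeddings that cluster by meaning rather than by language.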