AILS-NTUA at SemEval-2025 Task 3: Leveraging Large Language Models and Translation Strategies for Multilingual Hallucination Detection

📅 2025-03-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
Multilingual hallucination detection faces challenges due to scarce annotated data and poor cross-lingual generalization. To address this, the paper proposes Translation-Augmented Prompting (TAP), a training-free framework that translates multilingual inputs into English and leverages large language models for zero-shot hallucination detection. TAP further incorporates semantic consistency verification to improve robustness against translation artifacts. Because it requires no fine-tuning or additional training, it significantly lowers the adaptation barrier for low-resource languages. Evaluated on the Mu-SHROOM shared task, TAP performs strongly and consistently across the covered languages, placing first in two low-resource languages and supporting the cross-lingual generality of the translation-augmented strategy for multilingual hallucination detection.

📝 Abstract
Multilingual hallucination detection stands as an underexplored challenge, which the Mu-SHROOM shared task seeks to address. In this work, we propose an efficient, training-free LLM prompting strategy that enhances detection by translating multilingual text spans into English. Our approach achieves competitive rankings across multiple languages, securing two first positions in low-resource languages. The consistency of our results highlights the effectiveness of our translation strategy for hallucination detection, demonstrating its applicability regardless of the source language.
Problem

Research questions and friction points this paper is trying to address.

Multilingual hallucination detection in understudied languages.
Training-free LLM prompting for enhanced detection accuracy.
Translation strategy effectiveness across diverse source languages.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Training-free LLM prompting strategy
Translation of multilingual text to English
Effective across multiple low-resource languages
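The translate-then-prompt pipeline described above can be sketched as follows. This is a minimal illustration only: the function names, the lookup-table "translator", and the keyword-based "LLM" stub are all hypothetical stand-ins (a real system would call a machine-translation model and an actual LLM), not the authors' implementation.

```python
def translate_to_english(text: str, source_lang: str) -> str:
    """Stub translator; a real pipeline would invoke an MT model or API here."""
    # Hypothetical lookup standing in for machine translation.
    demo = {
        ("es", "La capital de Australia es Sídney."):
            "The capital of Australia is Sydney.",
    }
    return demo.get((source_lang, text), text)


def build_prompt(question: str, answer_en: str) -> str:
    """Zero-shot prompt asking an LLM to mark unsupported (hallucinated) spans."""
    return (
        "Question: " + question + "\n"
        "Answer: " + answer_en + "\n"
        "List any spans in the answer that are not supported by facts."
    )


def detect_hallucinated_spans(question: str, answer: str, source_lang: str):
    """Translate the answer into English, then prompt for hallucinated spans."""
    answer_en = translate_to_english(answer, source_lang)
    prompt = build_prompt(question, answer_en)
    # Stub "LLM": flags one known-false span for demonstration purposes only.
    spans = ["Sydney"] if "Sydney" in prompt else []
    return answer_en, spans


answer_en, spans = detect_hallucinated_spans(
    "What is the capital of Australia?",
    "La capital de Australia es Sídney.",
    "es",
)
```

Because detection happens on the English translation, the same zero-shot prompt can be reused for every source language, which is what makes the approach training-free across the Mu-SHROOM language set.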