🤖 AI Summary
This paper addresses the absence of reverse dictionary (RD) systems, i.e., systems that retrieve a target word from a semantic definition, in Arabic NLP. Methodologically, it proposes a semi-encoder Transformer architecture with geometrically decreasing layers, establishes an eight-dimensional lexical quality evaluation framework (covering definition completeness, accuracy, etc.), and implements end-to-end semantic matching using Arabic-specific pretrained models (e.g., ARBERTv2). Key contributions include: (1) the first formal theoretical formulation of the RD task for Arabic; (2) open-sourcing the modular toolkit RDTL and a guideline for constructing high-quality datasets; and (3) achieving state-of-the-art performance on a standard Arabic RD benchmark (MRR = 0.0644), demonstrating that Arabic-specific models significantly outperform multilingual baselines.
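The "geometrically decreasing layers" of the semi-encoder can be pictured as hidden sizes that shrink by a fixed ratio toward the output embedding. The sketch below is purely illustrative; the helper name, the 0.5 ratio, and the dimensions are assumptions, not the paper's actual configuration:

```python
def geometric_layer_sizes(input_dim, num_layers, ratio=0.5, min_dim=16):
    """Illustrative helper: hidden sizes that shrink geometrically,
    as in a 'semi-encoder' narrowing toward the target embedding.
    All parameters here are assumed, not taken from the paper."""
    sizes = []
    dim = float(input_dim)
    for _ in range(num_layers):
        # Floor each width and never shrink below a minimum dimension.
        sizes.append(max(int(dim), min_dim))
        dim *= ratio
    return sizes

# e.g. narrowing a 768-dim encoder output over 4 layers:
print(geometric_layer_sizes(768, 4))  # → [768, 384, 192, 96]
```

Each layer's width is half its predecessor's here, which concentrates capacity near the input while keeping the final projection compact.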
📝 Abstract
This study addresses a critical gap in Arabic natural language processing by developing an effective Arabic Reverse Dictionary (RD) system that enables users to find words based on their descriptions or meanings. We present a novel transformer-based approach with a semi-encoder neural network architecture featuring geometrically decreasing layers that achieves state-of-the-art results for Arabic RD tasks. Our methodology incorporates a comprehensive dataset construction process and establishes formal quality standards for Arabic lexicographic definitions. Experiments with various pre-trained models demonstrate that Arabic-specific models significantly outperform general multilingual embeddings, with ARBERTv2 achieving the best mean reciprocal rank (MRR = 0.0644). Additionally, we provide a formal abstraction of the reverse dictionary task that enhances theoretical understanding, and we develop a modular, extensible Python library (RDTL) with configurable training pipelines. Our analysis of dataset quality reveals important insights for improving Arabic definition construction, leading to eight specific standards for building high-quality reverse dictionary resources. This work contributes significantly to Arabic computational linguistics and provides valuable tools for language learning, academic writing, and professional communication in Arabic.
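The reported ranking score is a mean reciprocal rank (MRR): the average, over test definitions, of one divided by the 1-based rank at which the gold word appears in the system's candidate list. A minimal sketch of the metric (the example ranks are invented for illustration and are not from the paper's evaluation):

```python
def mean_reciprocal_rank(ranks):
    """MRR over the 1-based ranks of the gold word in each query's
    ranked candidate list. Ranks here are hypothetical examples."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# Gold word ranked 1st, 2nd, and 10th across three queries:
print(round(mean_reciprocal_rank([1, 2, 10]), 4))  # → 0.5333
```

An MRR of 0.0644 thus corresponds to the gold word appearing, on average, deep in the ranked list, which is typical for large-vocabulary reverse dictionary retrieval.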