🤖 AI Summary
To address the poor generalization and weak domain adaptability of single universal embedding models in cross-domain retrieval, this paper proposes a dynamic routing retrieval framework based on Mixture of Experts (MoE). Without requiring joint training, the framework employs a lightweight query router to select, in real time, the most suitable domain-specific expert embedding model. It supports zero-shot addition and removal of experts and maintains strong generalization—even in the absence of a matching domain expert. The core contribution is the first integration of a scalable, modular expert routing mechanism into information retrieval. Evaluated on the BEIR benchmark, the method achieves a +2.1 nDCG@10 improvement over the MSMARCO baseline, outperforms multi-task models by +3.2 points, and its routing mechanism surpasses routing techniques commonly used in language modeling by +1.8 points on average.
📝 Abstract
Information retrieval methods often rely on a single embedding model trained on large, general-domain datasets like MSMARCO. While this approach can produce a retriever with reasonable overall performance, such models often underperform those trained on domain-specific data when tested on their respective domains. Prior work in information retrieval has tackled this through multi-task training, but the idea of routing over a mixture of domain-specific expert retrievers remains unexplored despite the popularity of such ideas in language model generation research. In this work, we introduce RouterRetriever, a retrieval model that leverages a mixture of domain-specific experts by using a routing mechanism to select the most appropriate expert for each query. RouterRetriever is lightweight and allows easy addition or removal of experts without additional training. Evaluation on the BEIR benchmark demonstrates that RouterRetriever outperforms both models trained on MSMARCO (+2.1 absolute nDCG@10) and multi-task models (+3.2). This is achieved by employing our routing mechanism, which surpasses other routing techniques (+1.8 on average) commonly used in language modeling. Furthermore, the benefit generalizes well to other datasets, even in the absence of a specific expert for the dataset. RouterRetriever is the first work to demonstrate the advantages of routing over a mixture of domain-specific expert embedding models as an alternative to a single, general-purpose embedding model, especially when retrieving from diverse, specialized domains.
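To make the query-routing idea concrete, here is a minimal sketch of routing over a pool of expert embedders. This is an illustration of the general technique only, not the paper's implementation: the expert names, the use of per-domain routing vectors, and cosine similarity as the routing score are all placeholder assumptions, and the "experts" are stand-in linear projections rather than trained embedding models.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Hypothetical domain experts: each is a random projection standing in for a
# trained domain-specific embedding model. New experts can be added to (or
# removed from) this dict without touching anything else -- no retraining.
experts = {
    "biomedical": rng.normal(size=(DIM, DIM)),
    "finance": rng.normal(size=(DIM, DIM)),
    "general": rng.normal(size=(DIM, DIM)),
}

# One routing vector per expert, e.g. a centroid of embeddings representative
# of that expert's domain. Random placeholders here.
routing_vectors = {name: rng.normal(size=DIM) for name in experts}

def route(query_vec: np.ndarray) -> str:
    """Pick the expert whose routing vector is most similar to the query."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(routing_vectors, key=lambda name: cosine(query_vec, routing_vectors[name]))

def embed(query_vec: np.ndarray) -> np.ndarray:
    """Embed the query with whichever expert the router selects."""
    return experts[route(query_vec)] @ query_vec

q = rng.normal(size=DIM)
print(route(q), embed(q).shape)
```

Because the router only compares a query against per-expert routing vectors, adding or removing an expert amounts to editing two dictionary entries, which mirrors the training-free extensibility the abstract describes.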