🤖 AI Summary
Generic text embedding models suffer significant performance degradation on retrieval tasks over enterprise private data, particularly when that data is dense with domain-specific terminology. To address this, we propose BMEmbed, the first method to use unsupervised BM25 ranking outputs as supervision signals for contrastive learning, enabling lightweight, efficient, annotation-free domain adaptation. We show empirically that BM25-derived signals jointly improve the alignment and uniformity of the learned embedding space. BMEmbed is model-agnostic and compatible with diverse foundation embedding models (e.g., BGE, E5). Extensive experiments across multiple private-domain datasets show an average 12.7% improvement in Mean Reciprocal Rank (MRR), confirming strong generalization. Our implementation is publicly available.
📝 Abstract
Text embedding models play a cornerstone role in AI applications such as retrieval-augmented generation (RAG). While general-purpose text embedding models demonstrate strong performance on generic retrieval benchmarks, their effectiveness diminishes when they are applied to private datasets (e.g., company-specific proprietary data), which often contain specialized terminology and lingo. In this work, we introduce BMEmbed, a novel method for adapting general-purpose text embedding models to private datasets. By leveraging the well-established keyword-based retrieval technique BM25, we construct supervisory signals from the ranking of keyword-based retrieval results to facilitate model adaptation. We evaluate BMEmbed across a range of domains, datasets, and models, showing consistent improvements in retrieval performance. Moreover, we provide empirical insights into how BM25-based signals improve embeddings by fostering alignment and uniformity, highlighting the value of this approach for adapting models to domain-specific data. We release the source code at https://github.com/BaileyWei/BMEmbed for the research community.
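To make the core idea concrete, here is a minimal sketch of how BM25 rankings can be turned into annotation-free supervision for contrastive training: score a query against the private corpus with BM25, then treat the top-ranked document as a pseudo-positive and low-ranked documents as negatives. This is an illustrative toy (pure-Python BM25, a hypothetical `make_training_pair` sampling scheme); the paper's actual ranking-based signal construction may differ in its details.

```python
import math
from collections import Counter

def bm25_scores(query, corpus, k1=1.5, b=0.75):
    """Okapi BM25 score of each tokenized document in `corpus` for a tokenized `query`."""
    n_docs = len(corpus)
    avgdl = sum(len(d) for d in corpus) / n_docs
    df = Counter()                                   # document frequency per term
    for doc in corpus:
        df.update(set(doc))
    scores = []
    for doc in corpus:
        tf = Counter(doc)
        score = 0.0
        for term in query:
            if term not in tf:
                continue
            idf = math.log((n_docs - df[term] + 0.5) / (df[term] + 0.5) + 1)
            norm = tf[term] + k1 * (1 - b + b * len(doc) / avgdl)
            score += idf * tf[term] * (k1 + 1) / norm
        scores.append(score)
    return scores

def make_training_pair(query, corpus, n_neg=2):
    """Turn a BM25 ranking into a contrastive training signal: the top-ranked
    document is the pseudo-positive, the lowest-ranked ones the negatives.
    (Hypothetical sampling scheme for illustration only.)"""
    scores = bm25_scores(query, corpus)
    order = sorted(range(len(corpus)), key=scores.__getitem__, reverse=True)
    return order[0], order[-n_neg:]

# A tiny "private" corpus with domain jargon, tokenized by whitespace.
corpus = [
    "bmembed adapts embedding models with bm25 signals".split(),
    "cats sleep most of the day".split(),
    "bm25 is a keyword retrieval baseline".split(),
]
positive, negatives = make_training_pair("bm25 retrieval".split(), corpus, n_neg=1)
print(positive, negatives)  # prints "2 [1]": the keyword-matching document ranks first
```

The resulting (query, positive, negatives) triples could then be fed to any standard contrastive objective (e.g., an InfoNCE-style loss) to fine-tune an off-the-shelf embedding model such as BGE or E5, with no human relevance labels required.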