NER Retriever: Zero-Shot Named Entity Retrieval with Type-Aware Embeddings

📅 2025-09-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses zero-shot named entity retrieval: retrieving documents that mention entities of a user-specified type (e.g., “scientists skilled in quantum computing”) without relying on a predefined entity-type schema. The authors construct fine-grained entity embeddings from the value vectors of intermediate Transformer layers in large language models, and train a lightweight contrastive projection network that aligns entity mentions and natural-language type descriptions in a shared semantic space, enabling efficient nearest-neighbor retrieval. Compared with lexical-matching and dense sentence-retrieval baselines, the approach achieves significant gains in retrieval accuracy across three open-domain benchmarks. To the authors' knowledge, this is the first method to enable scalable, semantically precise zero-shot entity retrieval without any predefined type ontology or schema.
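The core representational idea above can be sketched in a toy form. The snippet below uses a single random value-projection matrix to stand in for one mid-layer attention block of an LLM (in practice one would capture the output of that layer's value projection, e.g. via a forward hook); all dimensions, the mention span, and the pooling choice are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for one mid-layer attention block: hidden size, sequence
# length, and weights are illustrative, not the paper's actual model.
d_model = 16
seq_len = 6                                 # tokens of a sentence with a mention
W_v = rng.normal(size=(d_model, d_model))   # value-projection matrix of the layer

hidden_states = rng.normal(size=(seq_len, d_model))  # mid-layer hidden states

# Value vectors are the hidden states passed through the value projection;
# the paper's observation is that these mid-layer values encode fine-grained
# type information better than top-layer embeddings.
values = hidden_states @ W_v                # (seq_len, d_model)

# Suppose tokens 2..3 span the entity mention: mean-pool its value vectors
# into a single fixed-size entity embedding (pooling choice is an assumption).
mention_span = slice(2, 4)
entity_embedding = values[mention_span].mean(axis=0)

print(entity_embedding.shape)               # (16,)
```

This embedding is what the projection network described next would refine before indexing.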

📝 Abstract
We present NER Retriever, a zero-shot retrieval framework for ad-hoc Named Entity Retrieval, a variant of Named Entity Recognition (NER) in which the types of interest are not provided in advance and a user-defined type description is used to retrieve documents mentioning entities of that type. Instead of relying on fixed schemas or fine-tuned models, our method builds on internal representations of large language models (LLMs) to embed both entity mentions and user-provided open-ended type descriptions into a shared semantic space. We show that internal representations, specifically the value vectors from mid-layer transformer blocks, encode fine-grained type information more effectively than commonly used top-layer embeddings. To refine these representations, we train a lightweight contrastive projection network that aligns type-compatible entities while separating unrelated types. The resulting entity embeddings are compact, type-aware, and well-suited for nearest-neighbor search. Evaluated on three benchmarks, NER Retriever significantly outperforms both lexical and dense sentence-level retrieval baselines. Our findings provide empirical support for representation selection within LLMs and demonstrate a practical solution for scalable, schema-free entity retrieval. The NER Retriever codebase is publicly available at https://github.com/ShacharOr100/ner_retriever.
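The "lightweight contrastive projection network" in the abstract can be illustrated with a small sketch. The MLP shape, the InfoNCE-style in-batch loss, and the temperature below are assumptions for illustration; the paper's actual architecture and training objective may differ, and the mention/type vectors here are synthetic stand-ins for LLM-derived embeddings.

```python
import numpy as np

rng = np.random.default_rng(1)

def project(x, W1, b1, W2, b2):
    """Hypothetical two-layer MLP projection head with unit-normalized output."""
    h = np.maximum(x @ W1 + b1, 0.0)                      # ReLU
    z = h @ W2 + b2
    return z / np.linalg.norm(z, axis=-1, keepdims=True)  # project to unit sphere

d_in, d_hid, d_out = 16, 32, 8
W1 = rng.normal(scale=0.1, size=(d_in, d_hid)); b1 = np.zeros(d_hid)
W2 = rng.normal(scale=0.1, size=(d_hid, d_out)); b2 = np.zeros(d_out)

# Batch of entity-mention embeddings paired with embeddings of their
# matching type descriptions (synthetic data, not real model outputs).
mentions = rng.normal(size=(4, d_in))
types = mentions + 0.1 * rng.normal(size=(4, d_in))       # paired positives

zm = project(mentions, W1, b1, W2, b2)
zt = project(types, W1, b1, W2, b2)

# InfoNCE-style loss: each mention should match its own type description
# against the other in-batch descriptions, pulling type-compatible pairs
# together and pushing unrelated types apart.
tau = 0.07                                                # temperature (assumed)
logits = zm @ zt.T / tau                                  # (4, 4) cosine sims
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_probs))
print(float(loss))
```

Minimizing such a loss over many (mention, type) pairs is what makes the projected embeddings type-aware and suitable for nearest-neighbor search.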
Problem

Research questions and friction points this paper is trying to address.

Zero-shot named entity retrieval with user-defined types
Creating type-aware embeddings without fixed schemas
Refining LLM representations for effective entity-typing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses LLM internal value vectors for embeddings
Trains lightweight contrastive projection network
Creates compact type-aware embeddings for retrieval
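The retrieval step implied by the last bullet is plain nearest-neighbor search over the pre-computed entity embeddings. The sketch below uses random unit vectors as stand-ins for the indexed mention embeddings and the embedded type description; the index size, dimensionality, and top-k value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical pre-computed index: one compact type-aware embedding per
# entity mention (rows), all unit-normalized at indexing time.
index = rng.normal(size=(100, 8))
index /= np.linalg.norm(index, axis=1, keepdims=True)

# The user's open-ended type description would be embedded with the same
# projection pipeline; here a random unit vector stands in for it.
query = rng.normal(size=8)
query /= np.linalg.norm(query)

# On unit vectors, cosine similarity reduces to a dot product; retrieval
# is simply the top-k most similar indexed mentions.
k = 5
scores = index @ query
top_k = np.argsort(-scores)[:k]
print(top_k)
```

At scale this exact scan would typically be replaced by an approximate nearest-neighbor index, but the scoring logic is the same.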