PromptDSI: Prompt-based Rehearsal-free Instance-wise Incremental Learning for Document Retrieval

📅 2024-06-18
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Differentiable Search Indexes (DSIs) incur prohibitive computational overhead on dynamic document collections because updates require full retraining, while existing continual learning approaches rely on memory replay or generative rehearsal, which is unsuitable for privacy-sensitive settings. This paper proposes PromptDSI, a rehearsal-free, prompt-based framework for incremental index updating. Its core contributions are: (1) eliminating the initial forward pass that prior prompt-based continual learning methods need for query-key matching, which doubles their training and inference time; (2) a topic-aware prompt pool that uses neural topic embeddings as fixed keys, mitigating the parameter underutilization caused by collapse of the query-key matching mechanism; and (3) a lightweight architecture that attaches learnable prompts to a frozen pre-trained language model (PLM) encoder. Evaluated on NQ320k and MS MARCO 300k, BERT-based PromptDSI improves new-corpus Hits@10 by more than 4% and MRR@10 by up to 3%, while matching IncDSI's forgetting rates.
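The fixed-key prompt pool in contribution (2) can be illustrated with a minimal numpy sketch. All dimensions, the random stand-ins for neural topic embeddings, and the `select_prompts` helper are hypothetical, not the paper's implementation; the point is that frozen keys cannot drift toward one another during training, so every prompt in the pool stays reachable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: d = hidden size, M = pool size, L_p = prompt length.
d, M, L_p = 16, 5, 4

# Fixed keys: in PromptDSI these would be neural topic embeddings; here
# they are random unit vectors. Because the keys are never updated, they
# cannot collapse onto one another, the failure mode of learned
# query-key matching.
topic_keys = rng.normal(size=(M, d))
topic_keys /= np.linalg.norm(topic_keys, axis=1, keepdims=True)

# Learnable prompt values, one length-L_p prompt per key.
prompt_values = rng.normal(size=(M, L_p, d))

def select_prompts(query_repr: np.ndarray, top_k: int = 2) -> np.ndarray:
    """Pick the top_k prompts whose fixed topic keys best match the query."""
    q = query_repr / np.linalg.norm(query_repr)
    scores = topic_keys @ q                 # cosine similarity to each key
    chosen = np.argsort(scores)[-top_k:]    # indices of the best-matching keys
    # Concatenate the chosen prompts; these would be prepended to the input.
    return prompt_values[chosen].reshape(top_k * L_p, d)

prompts = select_prompts(rng.normal(size=d))
print(prompts.shape)  # (8, 16): top_k * L_p prompt tokens of width d
```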

📝 Abstract
Differentiable Search Index (DSI) utilizes Pre-trained Language Models (PLMs) for efficient document retrieval without relying on external indexes. However, DSI needs full re-training to handle updates in dynamic corpora, causing significant computational inefficiencies. We introduce PromptDSI, a prompt-based rehearsal-free approach for instance-wise incremental learning document retrieval. PromptDSI attaches prompts to the frozen PLM's encoder of DSI, leveraging its powerful representation to efficiently index new corpora while maintaining a balance between stability and plasticity. We eliminate the initial forward pass of prompt-based continual learning methods that doubles training and inference time. Moreover, we propose a topic-aware prompt pool that employs neural topic embeddings as fixed keys. This strategy ensures diverse and effective prompt usage, addressing the challenge of parameter underutilization caused by the collapse of the query-key matching mechanism. Our empirical evaluations demonstrate that BERT-based PromptDSI matches IncDSI in managing forgetting while improving new corpora performance by more than 4% Hits@10 on NQ320k and up to 3% MRR@10 on MS MARCO 300k.
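The abstract's idea of attaching prompts to a frozen encoder can be sketched in a few lines of numpy. This is a toy stand-in, not the paper's architecture: `W` plays the role of the frozen PLM encoder, and `prompts` are the only parameters that would receive gradients. All names and dimensions are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical sizes: hidden width, input length, prompt length.
d, seq_len, L_p = 16, 10, 4

token_embeds = rng.normal(size=(seq_len, d))  # stand-in for frozen token embeddings
prompts = rng.normal(size=(L_p, d)) * 0.01    # the only trainable parameters
W = rng.normal(size=(d, d)) / np.sqrt(d)      # stand-in for the frozen encoder

def encode_with_prompts(token_embeds: np.ndarray, prompts: np.ndarray) -> np.ndarray:
    """Prepend prompt tokens to the input and run the frozen encoder.
    Gradients would flow only into `prompts`; the encoder stays fixed,
    which is what balances stability (old corpora) and plasticity (new)."""
    x = np.concatenate([prompts, token_embeds], axis=0)
    h = np.tanh(x @ W)        # frozen transformation
    return h.mean(axis=0)     # pooled representation used for retrieval

vec = encode_with_prompts(token_embeds, prompts)
print(vec.shape)  # (16,)
```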
Problem

Research questions and friction points this paper is trying to address.

Enables incremental document retrieval without full re-training
Eliminates need for memory buffers or previous data access
Improves retrieval latency while maintaining performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Prompt-based continual learning for document retrieval
Topic-aware prompt pool with neural embeddings
Efficient indexing without accessing previous data
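The third bullet, indexing new documents without touching previous data, can be pictured as appending rows to a docid classifier. This numpy sketch is a speculative simplification (the `index_new_document` and `retrieve` helpers and all sizes are invented for illustration); it only shows why an instance-wise update needs neither old documents nor retraining of existing rows.

```python
import numpy as np

rng = np.random.default_rng(2)
d = 16  # hypothetical representation width

# Hypothetical docid classifier: one weight row per indexed document.
docid_weights = rng.normal(size=(100, d))  # 100 documents indexed so far

def index_new_document(doc_repr: np.ndarray) -> int:
    """Instance-wise update: append one classifier row for the new document,
    touching neither the old rows nor any stored training data."""
    global docid_weights
    new_row = doc_repr / np.linalg.norm(doc_repr)
    docid_weights = np.vstack([docid_weights, new_row[None, :]])
    return docid_weights.shape[0] - 1  # the new document's id

def retrieve(query_repr: np.ndarray, k: int = 10) -> np.ndarray:
    """Score the query against every docid row and return the top-k ids."""
    scores = docid_weights @ query_repr
    return np.argsort(scores)[::-1][:k]

new_id = index_new_document(rng.normal(size=d))
print(new_id)  # 100
```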