InPars+: Supercharging Synthetic Data Generation for Information Retrieval Systems

📅 2025-08-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing synthetic query generation methods suffer from low signal quality and static prompt templates, limiting their effectiveness in neural information retrieval (NIR). To address these limitations, this paper proposes an enhanced synthetic query generation framework tailored for NIR. Methodologically: (1) Contrastive Preference Optimization (CPO) is used for fine-grained alignment training of the query generator, improving the relevance of generated queries to their target documents; (2) a dynamic Chain-of-Thought (CoT) prompting mechanism is built on the DSPy framework, enabling context-aware, adaptive prompt generation and reducing reliance on aggressive post-hoc filtering. Experiments on the SciFact benchmark demonstrate significant improvements in both re-ranking and end-to-end generative retrieval performance. All code and datasets are publicly released.
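The CPO training signal described above can be sketched as follows. This is a minimal illustration of the contrastive preference objective (following the published CPO formulation: a sigmoid preference term plus a negative log-likelihood anchor on the preferred output), not the paper's actual training code; the function name and the β value are illustrative assumptions.

```python
import math

def cpo_loss(logp_preferred: float, logp_rejected: float, beta: float = 0.1) -> float:
    """CPO loss for one preference pair of generated queries.

    logp_preferred / logp_rejected are the (summed) sequence
    log-probabilities of the preferred and rejected query under the
    query-generator policy. The loss pushes the preferred query's
    likelihood above the rejected one's (contrastive term) while
    anchoring the model to the preferred query (NLL term).
    Hypothetical sketch; the paper trains with this objective via a
    standard preference-optimization library, not this function.
    """
    margin = beta * (logp_preferred - logp_rejected)
    # -log sigmoid(margin): small when the preferred query is clearly more likely
    preference_term = math.log(1.0 + math.exp(-margin))
    # NLL regularizer on the preferred query keeps generations fluent
    nll_term = -logp_preferred
    return preference_term + nll_term
```

In practice the log-probabilities come from the generator LLM's token logits, and the loss is averaged over a batch of preference pairs mined from the filtered synthetic queries.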

📝 Abstract
This work revisits and extends synthetic query generation pipelines for Neural Information Retrieval (NIR) by leveraging the InPars Toolkit, a reproducible, end-to-end framework for generating training data using large language models (LLMs). We first assess the reproducibility of the original InPars, InPars-V2, and Promptagator pipelines on the SciFact benchmark and validate their effectiveness using open-source reranker and generator models. Building on this foundation, we introduce two key extensions to the pipeline: (1) fine-tuning a query generator LLM via Contrastive Preference Optimization (CPO) to improve the signal quality in generated queries, and (2) replacing static prompt templates with dynamic, Chain-of-Thought (CoT) optimized prompts using the DSPy framework. Our results show that both extensions reduce the need for aggressive filtering while improving retrieval performance. All code, models, and synthetic datasets are publicly released to support further research at: https://github.com/danilotpnta/IR2-project
Problem

Research questions and friction points this paper is trying to address.

Enhancing synthetic query generation for neural information retrieval systems
Improving LLM-generated query quality through contrastive preference optimization
Replacing static prompts with dynamic chain-of-thought optimized prompts
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fine-tuning LLM via Contrastive Preference Optimization
Replacing static prompts with dynamic CoT prompts
Using DSPy framework for optimized prompt generation
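To make the "dynamic CoT prompts" idea concrete, here is a hypothetical sketch of a document-aware chain-of-thought prompt builder. The paper produces its prompts with DSPy's CoT optimization rather than a hand-written helper, so the template, instruction text, and function below are illustrative assumptions, not DSPy API calls.

```python
def build_cot_prompt(document: str, few_shot_pairs: list[tuple[str, str]]) -> str:
    """Assemble a chain-of-thought query-generation prompt.

    Instead of a single static template, the few-shot demonstrations
    and reasoning instruction are assembled per document, mimicking
    the context-aware prompts that DSPy optimizes automatically.
    (Hypothetical helper: the paper's prompts come from DSPy, not
    from this function.)
    """
    lines = [
        "Generate a search query that this document would answer.",
        "Think step by step about the document's key claim first.",
    ]
    # Demonstrations can be selected dynamically per target document
    for doc, query in few_shot_pairs:
        lines.append(f"Document: {doc}")
        lines.append("Reasoning: the document's central claim suggests a query.")
        lines.append(f"Query: {query}")
    lines.append(f"Document: {document}")
    lines.append("Reasoning:")  # the LLM continues with its reasoning and query
    return "\n".join(lines)
```

Swapping the demonstration set and instruction per document is what distinguishes this from the static InPars templates; DSPy automates that selection and optimization.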
Matey Krastev
University of Amsterdam, Amsterdam, the Netherlands
Miklos Hamar
University of Amsterdam, Amsterdam, the Netherlands
Danilo Toapanta
University of Amsterdam, Amsterdam, the Netherlands
Jesse Brouwers
University of Amsterdam, Amsterdam, the Netherlands
Yibin Lei
University of Amsterdam
Information Retrieval, Natural Language Processing