VOYAGER: A Training-Free Approach for Generating Diverse Datasets using LLMs

📅 2025-12-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing LLM-based synthetic data generation methods suffer from insufficient diversity. To address this, we propose a training-free, closed-model-compatible iterative diversity optimization framework. Our approach explicitly models semantic similarity among data instances using Determinantal Point Processes (DPPs), and integrates zero-shot prompting with iterative reweighted sampling—enabling controllable diversity enhancement without model fine-tuning. The framework is theoretically grounded, interpretable, and highly scalable. Extensive experiments across multi-task synthetic data generation scenarios demonstrate that our method improves dataset diversity by 1.5–3× over state-of-the-art baselines, leading to substantial gains in downstream task generalization performance.

📝 Abstract
Large language models (LLMs) are increasingly being used to generate synthetic datasets for the evaluation and training of downstream models. However, prior work has noted that such generated data lacks diversity. In this paper, we propose Voyager, a novel, principled approach to generating diverse datasets. Our approach is iterative and directly optimizes a mathematical quantity that captures the diversity of the dataset, using the machinery of determinantal point processes. Furthermore, our approach is training-free, applicable to closed-source models, and scalable. In addition to providing theoretical justification for our method, we demonstrate through comprehensive experiments that Voyager significantly outperforms popular baseline approaches, yielding a 1.5–3× improvement in diversity.
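To make the DPP connection concrete, here is a minimal sketch of how a determinant-based diversity score behaves. This is an illustration under assumptions, not the authors' implementation: items are represented by embedding vectors (the paper does not specify its kernel construction here), and diversity is measured as the log-determinant of a cosine-similarity kernel, which a DPP assigns higher probability to for more spread-out sets.

```python
# Sketch: DPP-style diversity score for a set of items, assuming each item
# has an embedding vector. Higher log-det => more mutually dissimilar items.
import numpy as np

def dpp_diversity(embeddings: np.ndarray) -> float:
    """Log-determinant of the (regularized) cosine-similarity kernel."""
    X = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    L = X @ X.T  # Gram matrix of unit vectors = cosine similarities
    # Small ridge keeps the determinant finite for near-duplicate rows.
    _, logdet = np.linalg.slogdet(L + 1e-6 * np.eye(len(X)))
    return float(logdet)

rng = np.random.default_rng(0)
diverse = rng.normal(size=(5, 16))  # spread-out random directions
redundant = np.tile(rng.normal(size=(1, 16)), (5, 1)) \
    + 1e-3 * rng.normal(size=(5, 16))  # five near-copies of one item

# A near-duplicate set has a nearly rank-1 kernel, so its log-det collapses.
assert dpp_diversity(diverse) > dpp_diversity(redundant)
```

This collapse of the determinant for redundant sets is what makes the log-det a natural objective for an iterative generation loop: each new batch can be scored and reweighted by how much it raises the determinant.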
Problem

Research questions and friction points this paper is trying to address.

Generating diverse synthetic datasets with LLMs
Optimizing dataset diversity via determinantal point processes
Providing a training-free, scalable method compatible with closed-source models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Training-free iterative diversity optimization
Determinantal point processes for dataset generation
Scalable approach for closed-source LLMs