🤖 AI Summary
To address the low search efficiency of deep generative models in discrete spaces at test time, this paper proposes Neural Genetic Search (NGS), a gradient-free, plug-and-play framework that couples the population-based evolutionary dynamics of genetic algorithms with pre-trained deep generative models (e.g., diffusion and autoregressive models). The key idea is to reformulate crossover as a parent-conditioned generative sampling process, which makes the operator naturally compatible with arbitrary discrete structures. The method iterates four stages: population initialization, selection, generative crossover, and stochastic mutation. Evaluated on three distinct tasks (path planning, adversarial prompt generation for large language models, and molecular design), NGS consistently outperforms existing baselines, demonstrating superior efficiency, robustness to problem formulation, and strong cross-domain generalization.
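The four-stage loop above can be sketched as a plain genetic algorithm in which the crossover step is a stand-in for parent-conditioned generation. This is a minimal toy illustration, not the paper's implementation: `fitness`, the bit-string representation, and the per-position sampler that restricts each token to values carried by the parents (mimicking a generative model decoded under a parent-induced vocabulary restriction) are all hypothetical simplifications.

```python
import random

def fitness(seq):
    # Toy objective: maximize the number of 1s (placeholder for a real
    # score such as route length or a molecular property).
    return sum(seq)

def generative_crossover(parents, rng):
    # Stand-in for parent-conditioned generation: at each position, sample
    # only from the tokens the parents carry there, as a trained model
    # would when its output vocabulary is restricted to parent tokens.
    return [rng.choice([p[i] for p in parents]) for i in range(len(parents[0]))]

def mutate(seq, rate, rng):
    # Stochastic mutation: flip each bit independently with probability `rate`.
    return [1 - t if rng.random() < rate else t for t in seq]

def neural_genetic_search(pop_size=20, length=16, generations=30, seed=0):
    rng = random.Random(seed)
    # 1) Population initialization (unconditional sampling in the real method).
    population = [[rng.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # 2) Selection: keep the fitter half as the parent pool (elitism).
        population.sort(key=fitness, reverse=True)
        parent_pool = population[: pop_size // 2]
        # 3) Generative crossover + 4) stochastic mutation to refill the population.
        children = []
        while len(children) < pop_size - len(parent_pool):
            parents = rng.sample(parent_pool, 2)
            children.append(mutate(generative_crossover(parents, rng), 0.05, rng))
        population = parent_pool + children
    return max(population, key=fitness)

best = neural_genetic_search()
print(fitness(best))
```

In NGS proper, the hand-written `generative_crossover` is replaced by sampling from the pre-trained model conditioned on the selected parents, so the same loop applies unchanged across routing, prompt, and molecule representations.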
📝 Abstract
Effective search methods are crucial for improving the performance of deep generative models at test time. In this paper, we introduce a novel test-time search method, Neural Genetic Search (NGS), which incorporates the evolutionary mechanism of genetic algorithms into the generation procedure of deep models. The core idea behind NGS is its crossover, which is defined as parent-conditioned generation using trained generative models. This approach offers a versatile and easy-to-implement search algorithm for deep generative models. We demonstrate the effectiveness and flexibility of NGS through experiments across three distinct domains: routing problems, adversarial prompt generation for language models, and molecular design.