🤖 AI Summary
This work tackles the mode collapse, limited solution diversity, and high computational cost that commonly afflict verifier-free evolutionary methods. The authors propose a unified multi-model collaboration framework that schedules strong and weak models by marginal utility: high-performance models are deployed only at critical stages, while low-cost models handle the rest, balancing diversity against efficiency. The framework supports flexible deployment across open-source, closed-source, and hybrid model configurations. Experiments show new state-of-the-art results on multiple benchmarks, an approximately 3× reduction in API cost, up to 10× higher throughput under fixed budgets, and, for the first time on discovery-oriented tasks, verifier-free performance that matches or exceeds that of verifier-based methods.
📝 Abstract
We show that verifier-free evolution is bottlenecked by both diversity and efficiency: without external correction, repeated evolution accelerates collapse toward narrow modes, while the uniform use of a high-cost model wastes compute and quickly becomes economically impractical. We introduce Squeeze Evolve, a unified multi-model orchestration framework for verifier-free evolutionary inference. Our approach is guided by a simple principle: allocate model capability where it has the highest marginal utility. Stronger models are reserved for high-impact stages, while cheaper models handle the remaining stages at much lower cost. This principle addresses diversity and cost-efficiency jointly while remaining lightweight. Squeeze Evolve naturally supports open-source, closed-source, and mixed-model deployments. Across AIME 2025, HMMT 2025, LiveCodeBench V6, GPQA-Diamond, ARC-AGI-V2, and multimodal vision benchmarks such as MMMU-Pro and BabyVision, Squeeze Evolve consistently improves the cost-capability frontier over single-model evolution and achieves new state-of-the-art results on several tasks. Empirically, Squeeze Evolve reduces API cost by up to $\sim$3$\times$ and increases fixed-budget serving throughput by up to $\sim$10$\times$. Moreover, on discovery tasks, Squeeze Evolve is the first verifier-free evolutionary method to match, and in some cases exceed, the performance of verifier-based evolutionary methods.
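The marginal-utility principle above can be pictured as a budgeted routing problem: each pipeline stage gets the strong model only when its estimated gain per unit of extra cost is worth spending budget on. The sketch below is purely illustrative; the `Stage` fields, the `route` function, and the greedy gain-per-cost rule are our assumptions for exposition, not the paper's actual scheduler.

```python
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    strong_gain: float  # estimated quality gain from using the strong model here (assumed given)
    strong_cost: float  # extra cost of calling the strong model instead of the weak one

def route(stages: list[Stage], budget: float) -> dict[str, str]:
    """Greedy sketch of marginal-utility scheduling: every stage defaults
    to the cheap (weak) model; the strong model is granted to stages with
    the highest gain-per-cost ratio until the extra budget runs out."""
    assignment = {s.name: "weak" for s in stages}
    ranked = sorted(stages, key=lambda s: s.strong_gain / s.strong_cost, reverse=True)
    for s in ranked:
        if s.strong_cost <= budget:
            assignment[s.name] = "strong"
            budget -= s.strong_cost
    return assignment
```

For example, with three hypothetical stages and an extra budget of 2.0, only the stage with the best gain-per-cost ratio is promoted to the strong model; the rest stay on the weak model, mirroring the "strong models only at critical stages" idea.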