🤖 AI Summary
This paper addresses three core challenges in LLM-driven Neural Architecture Design (NAD): non-differentiable feedback, mode collapse, and structural infeasibility. It proposes RevoNAD, a reflective evolutionary orchestrator that integrates multi-round, multi-expert consensus reasoning; reward-variance-driven adaptive reflective exploration; and Pareto-guided non-dominated sorting for evolutionary selection. This co-optimizes architectural reasoning and search while preserving diversity, significantly improving feasibility and deployment efficiency. Evaluated across diverse benchmarks, including CIFAR10, CIFAR100, ImageNet16-120, COCO-5K, and Cityscapes, RevoNAD achieves state-of-the-art performance and demonstrates strong cross-task generalization and stability. Its key contribution lies in reformulating the non-differentiable feedback problem as a reflective evolutionary modeling task and mitigating redundancy and feasibility drift via a dynamic multi-objective trade-off mechanism.
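The Pareto-guided selection step mentioned above keeps only architectures that are not dominated on any objective. The paper's exact selection procedure is not given here; the following is a minimal generic sketch of non-dominated filtering over objective vectors (all objectives expressed so that larger is better, e.g. accuracy and negated latency), with illustrative values of my own choosing:

```python
def dominates(a, b):
    """a dominates b if a is at least as good on every objective
    and strictly better on at least one (all objectives maximized)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of candidate objective vectors."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]

# Hypothetical (accuracy, -latency) vectors: the third is dominated by the first.
front = pareto_front([(0.9, -5.0), (0.8, -3.0), (0.7, -6.0)])
```

Selecting from the front rather than by a single scalar score is what lets the search trade off accuracy, efficiency, latency, confidence, and diversity without collapsing them into one weighting.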
📝 Abstract
Recent progress in leveraging large language models (LLMs) has enabled Neural Architecture Design (NAD) systems to generate new architectures that are not limited to manually predefined search spaces. Nevertheless, LLM-driven generation remains challenging: the token-level design loop is discrete and non-differentiable, preventing feedback from smoothly guiding architectural improvement. As a result, these methods commonly suffer from mode collapse into redundant structures or drift toward infeasible designs when constructive reasoning is not well grounded. We introduce RevoNAD, a reflective evolutionary orchestrator that bridges LLM-based reasoning with feedback-aligned architectural search. First, RevoNAD applies a Multi-round Multi-expert Consensus to turn isolated design rules into meaningful architectural clues. Then, Adaptive Reflective Exploration adjusts the degree of exploration based on reward variance: it explores when feedback is uncertain and refines once stability is reached. Finally, Pareto-guided Evolutionary Selection promotes architectures that jointly optimize accuracy, efficiency, latency, confidence, and structural diversity. Across CIFAR10, CIFAR100, ImageNet16-120, COCO-5K, and Cityscapes, RevoNAD achieves state-of-the-art performance. Ablation and transfer studies further validate the effectiveness of RevoNAD in enabling practically reliable and deployable neural architecture design.
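The variance-driven explore/refine behavior described for Adaptive Reflective Exploration can be illustrated with a toy rule that maps recent reward variance to a sampling temperature. This is not the paper's mechanism; the function name, bounds, and scaling below are my own illustrative assumptions:

```python
import statistics

def adaptive_temperature(rewards, t_min=0.2, t_max=1.2, scale=1.0):
    """Illustrative mapping from reward variance to sampling temperature:
    high variance (uncertain feedback) -> temperature near t_max (explore),
    low variance (stable feedback) -> temperature near t_min (refine).
    All parameter values here are hypothetical."""
    if len(rewards) < 2:
        return t_max  # no variance signal yet: explore by default
    var = statistics.pvariance(rewards)
    # Squash variance into [t_min, t_max], saturating at `scale`.
    return t_min + (t_max - t_min) * min(var / scale, 1.0)
```

For example, identical rewards yield the minimum temperature (pure refinement), while widely spread rewards push the temperature toward its maximum.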