ZO-DARTS++: An Efficient and Size-Variable Zeroth-Order Neural Architecture Search Algorithm

📅 2025-03-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the inefficiency, inflexible operation selection, and poor adaptability to multiple resource constraints in differentiable Neural Architecture Search (NAS) for medical imaging, this paper proposes a novel efficient NAS framework. Methodologically, it (1) replaces first-order backpropagation with zeroth-order gradient approximation to reduce computational overhead; (2) introduces a temperature-annealed sparsemax distribution to enhance architectural interpretability and sparsity in operation selection; and (3) constructs a dynamically scalable width search space enabling fine-grained, elastic adaptation to heterogeneous resource constraints. The framework enables end-to-end differentiable joint optimization of architecture parameters and hyperparameters. Evaluated on medical image datasets, it improves average accuracy by up to 1.8% and shortens search time by approximately 38.6%. Its lightweight variant reduces model parameters by more than 35% with negligible accuracy degradation, significantly outperforming existing baselines.
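The zeroth-order gradient approximation in step (1) can be illustrated with a standard two-point, random-direction estimator. This is a minimal sketch of the general technique, not the paper's exact formulation; the function name `zo_gradient` and its parameters are hypothetical.

```python
import random

def zo_gradient(f, x, mu=1e-3):
    """Two-point zeroth-order estimate of the gradient of f at x.

    Samples a random Gaussian direction u and uses the central
    difference (f(x + mu*u) - f(x - mu*u)) / (2*mu) as a directional
    derivative estimate, scaled back along u. No backpropagation is
    needed: only two forward evaluations of f.
    """
    u = [random.gauss(0.0, 1.0) for _ in x]
    x_plus = [xi + mu * ui for xi, ui in zip(x, u)]
    x_minus = [xi - mu * ui for xi, ui in zip(x, u)]
    scale = (f(x_plus) - f(x_minus)) / (2.0 * mu)
    return [scale * ui for ui in u]
```

A single sample is noisy, but averaged over many random directions the estimate converges to the true gradient, which is why such estimators can stand in for first-order backpropagation at the cost of extra forward passes.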

📝 Abstract
Differentiable Neural Architecture Search (NAS) provides a promising avenue for automating the complex design of deep learning (DL) models. However, current differentiable NAS methods often face constraints in efficiency, operation selection, and adaptability under varying resource limitations. We introduce ZO-DARTS++, a novel NAS method that effectively balances performance and resource constraints. By integrating a zeroth-order approximation for efficient gradient handling, employing a sparsemax function with temperature annealing for clearer and more interpretable architecture distributions, and adopting a size-variable search scheme for generating compact yet accurate architectures, ZO-DARTS++ establishes a new balance between model complexity and performance. In extensive tests on medical imaging datasets, ZO-DARTS++ improves the average accuracy by up to 1.8% over standard DARTS-based methods and shortens search time by approximately 38.6%. Additionally, its resource-constrained variants can reduce the number of parameters by more than 35% while maintaining competitive accuracy levels. Thus, ZO-DARTS++ offers a versatile and efficient framework for generating high-quality, resource-aware DL models suitable for real-world medical applications.
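The sparsemax-with-temperature-annealing idea from the abstract can be sketched as follows: sparsemax projects the scaled logits onto the probability simplex and, unlike softmax, assigns exactly zero weight to weak candidate operations, so annealing the temperature downward yields an increasingly sparse, interpretable architecture distribution. The code below is a generic pure-Python sparsemax projection, not the paper's implementation.

```python
def sparsemax(logits, temperature=1.0):
    """Project logits / temperature onto the probability simplex.

    Unlike softmax, sparsemax can return exact zeros, so as the
    temperature is annealed toward zero the distribution over
    candidate operations becomes sparser and eventually one-hot.
    """
    scaled = [v / temperature for v in logits]
    z_sorted = sorted(scaled, reverse=True)
    cumsum, tau = 0.0, 0.0
    for k, v in enumerate(z_sorted, start=1):
        cumsum += v
        if 1.0 + k * v > cumsum:  # v is still inside the support
            tau = (cumsum - 1.0) / k
    return [max(v - tau, 0.0) for v in scaled]
```

At a high temperature the output spreads mass over all operations (exploration); at a low temperature most entries become exactly zero, which is the sparsity and interpretability the abstract refers to.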
Problem

Research questions and friction points this paper is trying to address.

Inefficiency and limited adaptability of existing differentiable NAS methods.
Difficulty balancing model complexity and performance under resource constraints.
Need for accurate medical imaging models that can be found with shorter search times.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Zeroth-order approximation for efficient gradient handling
Sparsemax function with temperature annealing for interpretability
Size-variable search scheme for compact architectures