Zero-Shot Neural Architecture Search with Weighted Response Correlation

📅 2025-07-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing zero-shot neural architecture search (NAS) methods suffer from high evaluation costs and insufficient stability and generalization. To address this, we propose weighted response correlation (WRCor), a training-free proxy that scores an architecture from the correlation coefficient matrices of its responses across input samples, quantifying its expressivity and generalizability. This work presents the first application of weighted response correlation to zero-shot NAS, enhanced with an ensemble voting mechanism over proxies to improve robustness. WRCor is algorithm-agnostic and compatible with diverse search strategies and architecture spaces. On ImageNet-1k, the method identifies a high-performance architecture with a 22.1% top-1 test error using only 4 GPU-hours, outperforming most existing NAS approaches. The implementation is publicly available.

📝 Abstract
Neural architecture search (NAS) is a promising approach for automatically designing neural network architectures. However, architecture estimation in NAS is computationally expensive and time-consuming because it requires training multiple architectures from scratch. Although existing zero-shot NAS methods use training-free proxies to accelerate architecture estimation, their effectiveness, stability, and generality are still lacking. We present a novel training-free estimation proxy called weighted response correlation (WRCor). WRCor uses correlation coefficient matrices of responses across different input samples to compute proxy scores for candidate architectures, measuring their expressivity and generalizability. Experimental results on proxy evaluation demonstrate that WRCor and its voting proxies are more efficient estimation strategies than existing proxies. We also combine them with different search strategies for architecture search. Experimental results on architecture search show that our zero-shot NAS algorithm outperforms most existing NAS algorithms across different search spaces. Our NAS algorithm discovers an architecture with a 22.1% test error on the ImageNet-1k dataset within 4 GPU-hours. All code is publicly available at https://github.com/kunjing96/ZSNAS-WRCor.git.
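To make the abstract's core idea concrete, here is a minimal sketch of a correlation-based, training-free proxy in the spirit of WRCor. It is an illustration, not the paper's implementation: the uniform pairwise weighting, the use of absolute Pearson correlation, and the sign convention (lower cross-sample correlation is scored higher, as a rough stand-in for expressivity) are all assumptions; `responses[i]` is assumed to be a flattened response vector produced by running input sample `i` through an untrained candidate network.

```python
import math

def pearson(x, y):
    # Pearson correlation coefficient between two equal-length vectors.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def wrcor_score(responses, weights=None):
    """Score one architecture from its responses to a batch of inputs.

    responses[i] is the flattened response vector for input sample i.
    The score is the negated weighted mean of the off-diagonal
    |correlation| entries, so less-correlated (more distinguishable)
    responses score higher. The uniform default weighting is an
    illustrative assumption, not the paper's weighting scheme.
    """
    n = len(responses)
    if weights is None:
        weights = [[1.0] * n for _ in range(n)]
    total, wsum = 0.0, 0.0
    for i in range(n):
        for j in range(i + 1, n):
            total += weights[i][j] * abs(pearson(responses[i], responses[j]))
            wsum += weights[i][j]
    return -total / wsum if wsum else 0.0
```

Under this convention, an architecture whose responses to distinct inputs are nearly collinear scores close to -1, while one that separates the inputs scores closer to 0.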
Problem

Research questions and friction points this paper is trying to address.

Reducing computational cost in neural architecture search
Improving zero-shot NAS effectiveness and stability
Enhancing architecture expressivity and generalizability measurement
Innovation

Methods, ideas, or system contributions that make the work stand out.

Weighted response correlation for training-free estimation
Correlation coefficient matrices measure expressivity and generalizability
Voting proxies enhance efficiency in architecture search
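The voting-proxy idea above can be sketched as a simple rank-aggregation rule. This is a hedged illustration of one plausible voting scheme, not the paper's exact mechanism: each proxy ranks all candidate architectures by its own score, and candidates are then ordered by summed rank.

```python
def vote_rank(proxy_scores):
    """Combine several proxies' scores by rank voting.

    proxy_scores[p][a] is proxy p's score for architecture a
    (higher is better). Each proxy ranks every architecture
    (rank 0 = best); architectures are returned best-first by
    total rank. The rank-sum rule is an illustrative choice.
    """
    n_arch = len(proxy_scores[0])
    rank_sum = [0] * n_arch
    for scores in proxy_scores:
        order = sorted(range(n_arch), key=lambda a: -scores[a])
        for rank, arch in enumerate(order):
            rank_sum[arch] += rank
    return sorted(range(n_arch), key=lambda a: rank_sum[a])
```

Rank aggregation is attractive here because individual training-free proxies produce scores on incomparable scales, while ranks can be summed directly and a single unstable proxy cannot dominate the ensemble.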