Graph is all you need? Lightweight data-agnostic neural architecture search without training

📅 2024-05-02
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
To address the prohibitively high computational cost of training candidate architectures in neural architecture search (NAS), this paper proposes NASGraph, a lightweight, zero-cost, data-agnostic method. Its core innovation lies in modeling neural networks as graphs and, for the first time, employing a graph topological feature, the average degree, as a performance proxy, thereby eliminating both training and data dependencies. On NAS-Bench-201, NASGraph identifies the best architecture among 200 randomly sampled candidates in 217 CPU seconds. It further achieves competitive ranking correlation across multiple benchmarks, including NAS-Bench-101, NDS, and Micro TransNAS-Bench-101. By decoupling architecture evaluation from training and data, NASGraph substantially improves evaluation efficiency and cross-benchmark transferability, offering a scalable, practical solution for resource-constrained NAS.

📝 Abstract
Neural architecture search (NAS) enables the automatic design of neural network models. However, training the candidates generated by the search algorithm for performance evaluation incurs considerable computational overhead. Our method, dubbed NASGraph, remarkably reduces the computational costs by converting neural architectures to graphs and using the average degree, a graph measure, as the proxy in lieu of the evaluation metric. Our training-free NAS method is data-agnostic and lightweight. It can find the best architecture among 200 randomly sampled architectures from NAS-Bench-201 in 217 CPU seconds. Besides, our method is able to achieve competitive performance on various benchmarks including the NAS-Bench-101, NAS-Bench-201, and NDS search spaces. We also demonstrate that NASGraph generalizes to more challenging tasks on Micro TransNAS-Bench-101.
Problem

Research questions and friction points this paper is trying to address.

High computational cost of training candidate architectures for evaluation in NAS
Dependence of existing architecture evaluation on training and data
Maintaining competitive ranking performance across diverse search spaces and benchmarks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Converts neural architectures to graphs
Uses average degree as proxy metric
Training-free, data-agnostic NAS method
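The average-degree proxy is simple enough to sketch in a few lines. The cell encoding below, and the assumption that a higher average degree predicts a better-performing architecture, are illustrative simplifications; NASGraph's actual conversion from neural modules to graph nodes is more involved:

```python
def average_degree(edges, num_nodes):
    """Average degree of a graph: each edge contributes two endpoints,
    so the average degree is 2 * |E| / |V|."""
    return 2 * len(edges) / num_nodes

# Toy cells in a NAS-Bench-201-style space: nodes are feature maps,
# edges are operations; "zeroize" ops would drop their edge entirely.
cell_dense = [(0, 1), (0, 2), (1, 2), (0, 3), (1, 3), (2, 3)]
cell_chain = [(0, 1), (1, 2), (2, 3)]

score_dense = average_degree(cell_dense, 4)  # 3.0
score_chain = average_degree(cell_chain, 4)  # 1.5

# Rank candidates by proxy score without any training or data,
# assuming (hypothetically here) higher average degree ranks better.
ranked = sorted([("dense", score_dense), ("chain", score_chain)],
                key=lambda t: -t[1])
```

Because the score depends only on graph topology, every candidate can be evaluated in constant time on a CPU, which is what makes the 217-second search over 200 architectures possible.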