SiamNAS: Siamese Surrogate Model for Dominance Relation Prediction in Multi-objective Neural Architecture Search

📅 2025-06-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Multi-objective neural architecture search (NAS) suffers from the high computational overhead of Pareto dominance evaluation and the cost of crowding distance computation. This paper proposes a lightweight Siamese-network-based surrogate model that predicts pairwise dominance between candidate architectures directly, bypassing costly ground-truth evaluations and conventional crowding distance calculations. It further incorporates a model-size heuristic to maintain population diversity efficiently, and the design generalises to multi-task co-optimisation and the generation of Sets of Pareto Sets (SOS). Evaluated on NAS-Bench-201, the method completes its search in just 0.01 GPU-days, finding the best architecture on CIFAR-10 and the second-best on ImageNet-16-120 in terms of test error rate. Dominance prediction accuracy reaches 92%, substantially reducing computational cost and improving the efficiency of multi-objective NAS.

📝 Abstract
Modern neural architecture search (NAS) is inherently multi-objective, balancing trade-offs such as accuracy, parameter count, and computational cost. This complexity makes NAS computationally expensive and nearly impossible to solve without efficient approximations. To address this, we propose a novel surrogate modelling approach that leverages an ensemble of Siamese network blocks to predict dominance relationships between candidate architectures. Lightweight and easy to train, the surrogate achieves 92% accuracy and replaces the crowding distance calculation in the survivor selection strategy with a heuristic rule based on model size. Integrated into a framework termed SiamNAS, this design eliminates costly evaluations during the search process. Experiments on NAS-Bench-201 demonstrate the framework's ability to identify Pareto-optimal solutions with significantly reduced computational costs. The proposed SiamNAS identified a final non-dominated set containing the best architecture in NAS-Bench-201 for CIFAR-10 and the second-best for ImageNet, in terms of test error rate, within 0.01 GPU days. This proof-of-concept study highlights the potential of the proposed Siamese network surrogate model to generalise to multi-tasking optimisation, enabling simultaneous optimisation across tasks. Additionally, it offers opportunities to extend the approach for generating Sets of Pareto Sets (SOS), providing diverse Pareto-optimal solutions for heterogeneous task settings.
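The dominance relation the surrogate learns to predict can be made concrete. Below is a minimal sketch of pairwise Pareto dominance labeling for minimised objectives (e.g. test error rate and parameter count); the function names and the ternary label encoding are illustrative assumptions, not the paper's implementation:

```python
from typing import Sequence

def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    """True if architecture a Pareto-dominates b, assuming every
    objective (error, params, FLOPs, ...) is to be minimised:
    a is no worse in all objectives and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def dominance_label(a: Sequence[float], b: Sequence[float]) -> int:
    """Ternary label a Siamese surrogate could be trained on:
    0 = a dominates b, 1 = b dominates a, 2 = incomparable."""
    if dominates(a, b):
        return 0
    if dominates(b, a):
        return 1
    return 2
```

Training pairs labeled this way let the surrogate rank candidates without evaluating either one's ground-truth objectives at search time.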
Problem

Research questions and friction points this paper is trying to address.

Pareto dominance evaluation requires costly ground-truth training of candidate architectures
Crowding distance computation adds further overhead to survivor selection
Multi-objective NAS is nearly intractable without efficient approximations
Innovation

Methods, ideas, or system contributions that make the work stand out.

An ensemble of Siamese network blocks predicts pairwise dominance relations (92% accuracy)
A model-size heuristic replaces crowding distance in survivor selection
SiamNAS finds Pareto-optimal architectures on NAS-Bench-201 within 0.01 GPU-days
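The survivor-selection idea can be sketched as follows: rather than computing crowding distance within each non-dominated front, ties between equally ranked candidates are broken by model size, preferring smaller architectures. This is a hypothetical rendering of the heuristic described above, not the paper's code:

```python
def select_survivors(candidates, pop_size):
    """Keep pop_size candidates: lower non-domination rank first,
    then fewer parameters (the model-size heuristic) as tie-breaker.
    Each candidate is a dict with 'rank' and 'n_params' keys (illustrative)."""
    ranked = sorted(candidates, key=lambda c: (c["rank"], c["n_params"]))
    return ranked[:pop_size]
```

Because Python's sort is stable and the key is a simple tuple, this replaces the O(M N log N) crowding distance pass with a single sort over precomputed model sizes.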
Yuyang Zhou
University of Nottingham Ningbo China
Ferrante Neri
Professor of Machine Learning and Artificial Intelligence, NICE group, University of Surrey
Heuristic Optimisation · Neural Architecture Search · Feature Selection · Machine Learning · P Systems
Y. Ong
Nanyang Technological University, Centre for Frontier AI Research, Institute of High Performance Computing, Agency for Science, Technology and Research
Ruibin Bai
University of Nottingham Ningbo China