Meta knowledge assisted Evolutionary Neural Architecture Search

📅 2025-04-30
🏛️ IEEE Transactions on Circuits and Systems for Video Technology (Print)
📈 Citations: 0
Influential: 0
🤖 AI Summary
Evolutionary neural architecture search (ENAS) suffers from high architectural evaluation overhead and suboptimal generalization due to fixed learning rate schedules, leading to information loss during optimization. To address these issues, this paper proposes a meta-knowledge-driven efficient NAS framework. Its key contributions are: (1) a novel meta-learning-rate (Meta-LR) scheduling mechanism that dynamically adapts the learning rate to each architecture’s optimization trajectory; (2) an integrated strategy combining an adaptive surrogate model with periodic structural mutation operators to jointly enhance search efficiency, population diversity, and robustness; and (3) a dynamic threshold-based architecture selection strategy to accelerate convergence toward high-quality candidates. Evaluated on CIFAR-10, CIFAR-100, and ImageNet1K, the method achieves state-of-the-art (SOTA) accuracy while reducing training cost by 37–52%. Moreover, it significantly improves architectural generalization and training stability.
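The dynamic threshold-based selection described in contribution (3) can be pictured as a two-stage evaluation loop. The sketch below is an illustrative assumption, not the paper's implementation: `cheap_eval` and `full_eval` stand in for few-epoch and complete-epoch accuracy estimators, and the quantile rule is one plausible way to make the threshold track the population.

```python
def adaptive_threshold_select(population, cheap_eval, full_eval, quantile=0.5):
    """Two-stage evaluation: score every candidate with a cheap few-epoch
    proxy, keep only those at or above a population-dependent threshold,
    then spend the full training budget on the survivors."""
    proxy = {id(arch): cheap_eval(arch) for arch in population}
    ranked = sorted(proxy.values())
    # Dynamic threshold: tracks the current population's score
    # distribution instead of using a fixed cutoff.
    threshold = ranked[int(quantile * (len(ranked) - 1))]
    survivors = [arch for arch in population if proxy[id(arch)] >= threshold]
    return [(arch, full_eval(arch)) for arch in survivors]
```

Because the threshold is recomputed from each generation's proxy scores, the filter automatically tightens as the population improves, which is what lets the search spend complete-epoch training only on promising candidates.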

📝 Abstract
Evolutionary computation (EC)-based neural architecture search (NAS) has achieved remarkable performance in the automatic design of neural architectures. However, the high computational cost of evaluating searched architectures poses a challenge for these methods, and a fixed learning rate (LR) schedule causes greater information loss across diverse searched architectures. This paper introduces an efficient EC-based NAS method that solves these problems via an innovative meta-learning framework. Specifically, a meta-learning-rate (Meta-LR) scheme obtains a suitable LR schedule through pretraining, which guides the training process with lower information loss when evaluating each individual. An adaptive surrogate model uses an adaptive threshold to select promising architectures after only a few epochs of training and then evaluates those architectures with complete epochs. Additionally, a periodic mutation operator is proposed to increase population diversity, which enhances generalizability and robustness. Experiments on the CIFAR-10, CIFAR-100, and ImageNet1K datasets demonstrate that the proposed method achieves performance comparable to that of many state-of-the-art peer methods, with lower computational cost and greater robustness.
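As a toy illustration of how a pretrained Meta-LR schedule could be consumed when training each individual (the wrapper and any table values are placeholders; the abstract does not specify how the pretrained schedule is parameterized):

```python
def make_meta_lr_schedule(meta_lrs):
    """Wrap a pretrained per-epoch LR table into a schedule function.
    In the paper these values would come from Meta-LR pretraining;
    here they are illustrative placeholders."""
    def schedule(epoch):
        # Past the end of the table, hold the last meta-learned value.
        return meta_lrs[min(epoch, len(meta_lrs) - 1)]
    return schedule
```

Each candidate architecture would then be trained with `schedule(epoch)` in place of a fixed cosine or step schedule, so the LR trajectory follows what pretraining found to lose the least information.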
Problem

Research questions and friction points this paper is trying to address.

Reducing computational cost in Evolutionary NAS methods
Optimizing learning rate schedules for diverse architectures
Enhancing population diversity for better generalizability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Meta-learning-rate scheme for adaptive learning schedules
Adaptive surrogate model for efficient architecture evaluation
Periodic mutation operator to enhance population diversity
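The periodic mutation idea in the last bullet might look like the following sketch, assuming a simple integer-encoded genome; `n_ops`, `period`, `base_rate`, and `boost_rate` are made-up illustrative values, not the paper's settings.

```python
import random

def periodic_mutation(genome, generation, n_ops=8, period=5,
                      base_rate=0.05, boost_rate=0.3, rng=None):
    """Mutate an integer-encoded architecture genome. Every `period`
    generations the per-gene mutation rate is temporarily boosted,
    periodically reinjecting diversity into the population."""
    rng = rng or random.Random()
    rate = boost_rate if generation % period == 0 else base_rate
    return [rng.randrange(n_ops) if rng.random() < rate else gene
            for gene in genome]
```

Alternating between a low baseline rate and a periodic burst is what keeps exploitation cheap most of the time while still guarding against premature convergence.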
Yangyang Li
Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, Joint International Research Laboratory of Intelligent Perception and Computation, International Research Center for Intelligent Perception and Computation, Collaborative Innovation Center of Quantum Information of Shaanxi Province, School of Artificial Intelligence, Xidian University, Xi’an 710071, China
Guanlong Liu
Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, Joint International Research Laboratory of Intelligent Perception and Computation, International Research Center for Intelligent Perception and Computation, Collaborative Innovation Center of Quantum Information of Shaanxi Province, School of Artificial Intelligence, Xidian University, Xi’an 710071, China
Ronghua Shang
Professor, Xidian University, Xi'an, China
Computational Intelligence, Machine Learning, Multiobjective Optimization
Licheng Jiao
Distinguished Professor of Xidian University, IEEE Fellow
Neural Networks, Computational Intelligence, Evolutionary Computation, Remote Sensing, Pattern Recognition