DANCE: Resource-Efficient Neural Architecture Search with Data-Aware and Continuous Adaptation

📅 2025-07-07
📈 Citations: 0 (influential: 0)
🤖 AI Summary
Existing neural architecture search (NAS) methods suffer from three critical bottlenecks: poor cross-scenario architectural adaptability, high search cost in single environments, and unstable cross-platform deployment performance. To address these, we propose a data-aware continuous adaptive NAS framework. Our method introduces: (1) continuous architectural distribution modeling with a learnable gating mechanism, enabling smooth, differentiable architecture evolution; (2) a multi-stage joint optimization strategy that unifies the search space for efficient cross-device adaptation; and (3) hardware-aware sampling grounded in distribution learning, jointly optimizing accuracy and deployment constraints. Evaluated on five benchmark datasets, our approach consistently outperforms state-of-the-art methods, reducing search overhead by 30–50% while maintaining robust performance across heterogeneous computational resources. To the best of our knowledge, this is the first work to achieve end-to-end hardware-adaptive architecture generation.
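The summary does not specify the exact form of the learnable gating mechanism. As a minimal sketch, assuming a DARTS-style softmax-gated mixture over candidate operations (the class, variable names, and toy ops below are illustrative, not from the paper):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

class GatedEdge:
    """One supernet edge: a learnable distribution over candidate ops."""
    def __init__(self, n_ops):
        # Gate logits; in a real search these are updated by gradient descent.
        self.alpha = np.zeros(n_ops)

    def forward(self, x, ops):
        # Mixed output: softmax gates make the architecture choice
        # differentiable, so it can evolve smoothly during training.
        g = softmax(self.alpha)
        return sum(gi * op(x) for gi, op in zip(g, ops))

# Toy candidate ops (stand-ins for conv/pool/skip in a real search space).
ops = [lambda x: x, lambda x: 2.0 * x]
edge = GatedEdge(n_ops=2)
y = edge.forward(np.array([1.0, 2.0]), ops)
# With uniform gates (alpha = 0): 0.5*x + 0.5*(2x) = 1.5*x -> [1.5, 3.0]
```

Discretizing at the end (e.g. taking the argmax gate per edge) recovers a single architecture from the learned distribution.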

📝 Abstract
Neural Architecture Search (NAS) has emerged as a powerful approach for automating neural network design. However, existing NAS methods face critical limitations in real-world deployments: architectures lack adaptability across scenarios, each deployment context requires costly separate searches, and performance consistency across diverse platforms remains challenging. We propose DANCE (Dynamic Architectures with Neural Continuous Evolution), which reformulates architecture search as a continuous evolution problem through learning distributions over architectural components. DANCE introduces three key innovations: a continuous architecture distribution enabling smooth adaptation, a unified architecture space with learned selection gates for efficient sampling, and a multi-stage training strategy for effective deployment optimization. Extensive experiments across five datasets demonstrate DANCE's effectiveness. Our method consistently outperforms state-of-the-art NAS approaches in terms of accuracy while significantly reducing search costs. Under varying computational constraints, DANCE maintains robust performance while smoothly adapting architectures to different hardware requirements. The code and appendix can be found at https://github.com/Applied-Machine-Learning-Lab/DANCE.
Problem

Research questions and friction points this paper is trying to address.

Enhance adaptability of neural architectures across scenarios
Reduce costly separate searches for each deployment context
Maintain performance consistency across diverse hardware platforms
Innovation

Methods, ideas, or system contributions that make the work stand out.

Continuous architecture distribution enables smooth adaptation
Unified architecture space with learned selection gates
Multi-stage training strategy optimizes deployment effectively
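The hardware-aware sampling step is not detailed on this page. One plausible reading, sketched under the assumption that discrete architectures are rejection-sampled from the learned gate distribution against a per-device latency budget (the op latencies and function names are hypothetical):

```python
import numpy as np

def sample_architecture(gate_logits, rng):
    """Draw one op index per edge from the learned gate distribution."""
    arch = []
    for alpha in gate_logits:
        e = np.exp(alpha - alpha.max())
        arch.append(rng.choice(len(alpha), p=e / e.sum()))
    return arch

def hardware_aware_sample(gate_logits, op_latency, budget_ms, rng, tries=100):
    """Rejection-sample until a drawn architecture fits the latency budget."""
    for _ in range(tries):
        arch = sample_architecture(gate_logits, rng)
        if sum(op_latency[i] for i in arch) <= budget_ms:
            return arch
    return None  # no feasible architecture found within `tries` draws

rng = np.random.default_rng(0)
# Two edges, each choosing among 3 ops; per-op latencies in ms (illustrative).
logits = [np.array([0.0, 1.0, 2.0]), np.array([2.0, 1.0, 0.0])]
latency = [1.0, 3.0, 8.0]
arch = hardware_aware_sample(logits, latency, budget_ms=6.0, rng=rng)
```

Tightening `budget_ms` for a weaker device steers sampling toward cheaper ops without rerunning the search, which matches the cross-device adaptation the bullets describe.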
👥 Authors
Maolin Wang, City University of Hong Kong
Tianshuo Wei, City University of Hong Kong
Sheng Zhang, City University of Hong Kong
Ruocheng Guo, Intuit AI Research (LLMs, Causal ML, Data Mining)
Wanyu Wang, City University of Hong Kong
Shanshan Ye, Australian Artificial Intelligence Institute, University of Technology Sydney
Lixin Zou, Wuhan University (Information Retrieval, Recommender System, Reinforcement Learning, Large Language Model)
Xuetao Wei, Associate Professor, Southern University of Science and Technology (AI Ethics, AI Safety)
Xiangyu Zhao, City University of Hong Kong