Publication
ArabWIC 2021
Conference paper

Accelerating neural architecture search with rank-preserving surrogate models

Abstract

In recent years, deep learning has enabled significant progress in several tasks, such as image recognition, speech recognition, and language modelling. Novel neural architectures are behind this achievement. However, designing these architectures manually is time-consuming and error-prone for human experts. Neural architecture search (NAS) automates the design process by searching for the best architecture in a huge search space. This search requires evaluating each sampled architecture through time-consuming training. To speed up NAS algorithms, several existing approaches use surrogate models that predict the accuracy of neural architectures instead of training each sampled one. In this paper, we propose RS-NAS, a Rank-preserving Surrogate model for NAS, trained with a rank-preserving loss function. We posit that the search algorithm does not need to know the exact accuracy of a candidate architecture, but only whether it is better or worse than others. We thoroughly evaluate and validate our surrogate models with state-of-the-art search algorithms. Using the rank-preserving surrogate models, local search in the DARTS search space finds an architecture that is 2% more accurate than the one found with the NAS-Bench-301 surrogate model in the same search time. The code and models are available at https://github.com/IHIaadj/ranked_nas.
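
The abstract does not spell out the exact rank-preserving loss, but the core idea, training a surrogate so that it only preserves the ordering of architectures rather than predicting their exact accuracies, can be illustrated with a standard pairwise margin ranking loss. The sketch below is an illustrative assumption, not the paper's implementation: ArchRanker, pairwise_rank_loss, and the random encodings are hypothetical stand-ins for the paper's surrogate model, loss function, and architecture representations.

# Illustrative sketch only; see the note above. Assumes PyTorch is available.
import torch
import torch.nn as nn

class ArchRanker(nn.Module):
    """Surrogate that maps an architecture encoding to a scalar score.

    The score is only meant to preserve the ranking of architectures,
    not to match their measured accuracies.
    """
    def __init__(self, encoding_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(encoding_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Squeeze the last dimension so each architecture gets one score.
        return self.net(x).squeeze(-1)

def pairwise_rank_loss(scores_a, scores_b, acc_a, acc_b, margin: float = 0.1):
    """Hinge-style margin ranking loss over pairs of architectures.

    target = +1 if architecture a is more accurate than b, else -1;
    the surrogate is penalized whenever its scores disagree with that order.
    """
    target = (acc_a > acc_b).float() * 2 - 1
    return nn.functional.margin_ranking_loss(scores_a, scores_b, target, margin=margin)

if __name__ == "__main__":
    # Toy training step on random pairs (stand-ins for encoded architectures
    # and their measured accuracies).
    model = ArchRanker(encoding_dim=32)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    enc_a, enc_b = torch.randn(64, 32), torch.randn(64, 32)
    acc_a, acc_b = torch.rand(64), torch.rand(64)

    loss = pairwise_rank_loss(model(enc_a), model(enc_b), acc_a, acc_b)
    opt.zero_grad()
    loss.backward()
    opt.step()

A surrogate trained this way only needs to agree with the relative ordering of candidate architectures, which is exactly the signal a search algorithm such as local search uses when deciding which neighbour to move to.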

Date

25 Aug 2021
