Reducing Neural Architecture Search Spaces with Training-Free Statistics and Computational Graph Clustering

The venue, Castle of Valentino

Abstract

Neural Architecture Search (NAS) aims at discovering Deep Neural Network (DNN) topologies that achieve good task accuracy. NAS algorithms are often time-consuming and computationally expensive; thus, being able to focus the search on those sub-spaces containing good candidates can greatly improve the performance of NAS algorithms. This work investigates how to efficiently identify high-performing subspaces of a given NAS space. We validate our ideas on the NAS-Bench-201 (NB201) dataset. NB201 networks are built by concatenating six identical cells; each cell has six edges, and each edge selects one of five candidate operations, so a cell can be configured in 5^6 different ways. NB201 therefore contains 5^6 = 15,625 DNNs, each of which is annotated with its task accuracy on three image classification datasets (CIFAR-10, CIFAR-100, ImageNet16-120).
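A minimal sketch of the counting argument above, assuming the five operation names from the NAS-Bench-201 paper: enumerating every assignment of one operation to each of the six cell edges reproduces the 5^6 = 15,625 architectures in the benchmark.

```python
# Sketch of the NB201 cell search space: a cell is a 4-node DAG with 6 edges,
# and each edge takes one of 5 candidate operations, giving 5**6 = 15,625 cells.
from itertools import product

OPS = ["none", "skip_connect", "nor_conv_1x1", "nor_conv_3x3", "avg_pool_3x3"]
NUM_EDGES = 6  # edges in the 4-node DAG of an NB201 cell

# Enumerate every possible cell configuration (one operation per edge).
search_space = list(product(OPS, repeat=NUM_EDGES))
assert len(search_space) == 5 ** NUM_EDGES == 15_625
```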

Date
May 17, 2022
Location
Castle of Valentino, Turin, Italy
Viale Mattioli 39, 10125 Turin
Thorir Mar Ingolfsson
PhD Student in Robust and Practical Machine Learning

My research interests include TinyML, Hardware-Aware NAS, and the application of ML to bio-signals.