Reducing Neural Architecture Search Spaces with Training-Free Statistics and Computational Graph Clustering

The venue, Castle of Valentino

Abstract

Neural Architecture Search (NAS) aims at discovering Deep Neural Network (DNN) topologies that achieve good task accuracy. NAS algorithms are often time-consuming and computationally expensive; being able to focus the search on the sub-spaces that contain good candidates can therefore greatly improve the performance of NAS algorithms. This work investigates how to efficiently identify high-performing sub-spaces of a given NAS space. We validate our ideas on the NAS-Bench-201 (NB201) dataset. NAS-Bench networks are built by concatenating six identical cells, and each cell can be configured in 5⁶ different ways. NB201 thus contains 5⁶ = 15,625 DNNs, each annotated with its task accuracy on three image classification datasets (CIFAR-10, CIFAR-100, ImageNet16-120).
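The search-space size follows directly from the cell structure: an NB201 cell has six configurable edges, each of which selects one of five candidate operations. A minimal Python sketch of this enumeration (the operation names follow the NB201 convention; the listing itself is purely illustrative):

```python
from itertools import product

# The five candidate operations available on each edge of a NAS-Bench-201 cell.
OPS = ("none", "skip_connect", "nor_conv_1x1", "nor_conv_3x3", "avg_pool_3x3")
NUM_EDGES = 6  # an NB201 cell is a DAG with six configurable edges

# One architecture = one operation choice per edge; enumerating all choices
# reproduces the 5^6 = 15,625 networks annotated in the benchmark.
search_space = list(product(OPS, repeat=NUM_EDGES))
assert len(search_space) == 5 ** 6 == 15_625
print(f"NB201 search space: {len(search_space)} architectures")
```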

Date
17 May 2022
Location
Castle of Valentino, Turin, Italy
Viale Mattioli, 39, 10125 Turin, Italy
Thorir Mar Ingolfsson
Postdoctoral Researcher

I develop efficient machine learning systems for biomedical wearables that operate under extreme resource constraints. My work bridges foundation models, neural architecture design, and edge deployment to enable real-time biosignal analysis on microwatt-scale devices.
