Neural Architecture Search (NAS) aims at discovering Deep Neural Network (DNN) topologies that have good task accuracy. NAS algorithms are often time-consuming and computationally expensive; thus, being able to focus the search on those sub-spaces containing good candidates can greatly improve the performance of NAS algorithms. This work investigates how to efficiently identify high-performing subspaces of a given NAS space. We validate our ideas on the NAS-Bench-201 (NB201) dataset. NAS-Bench networks are built by concatenating six identical cells, which can be configured in 5^6 different ways. NB201 thus contains 5^6 = 15,625 DNNs, each of which is annotated with its task accuracy over three different image classification datasets (CIFAR-10, CIFAR-100, ImageNet16-120).
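To make the size of this space concrete, the following sketch enumerates all NB201 cell configurations as the Cartesian product of five candidate operations over six cell edges; the operation names listed here are the standard NB201 operation set and are stated as an assumption, not taken from the text above.

```python
from itertools import product

# Assumed standard NB201 candidate operations, one per cell edge.
OPS = ["none", "skip_connect", "nor_conv_1x1", "nor_conv_3x3", "avg_pool_3x3"]
NUM_EDGES = 6  # an NB201 cell has six configurable edges

# Each cell is an assignment of one operation to each edge, so the
# search space is the Cartesian product OPS^NUM_EDGES.
cells = list(product(OPS, repeat=NUM_EDGES))
print(len(cells))  # 5**6 = 15625 distinct cells, hence 15,625 DNNs
```

Identifying a high-performing subspace then amounts to restricting this product, for example by fixing the operations on some edges, which shrinks the number of candidates a NAS algorithm must evaluate.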