Neural Architecture Search (NAS) aims to discover Deep Neural Network (DNN) topologies with high task accuracy. NAS algorithms are often time-consuming and computationally expensive, and they usually ignore the constraints of embedded or edge computing devices. To design better hardware (HW) for DNNs, and better DNNs for constrained HW, we must understand the recurring features of task-accurate architectures. This work investigates how to efficiently explore NAS spaces in search of task-accurate, HW-constrained DNNs, and how to design better search spaces for HW-constrained DNNs.