Deep neural networks have gained attention in the last decade as these new architectures have driven significant progress on a variety of tasks. Most of this success is due to hand-designed networks. However, this engineering process demands considerable time and expert knowledge, which has led to growing interest in automating the design of deep architectures. Several new algorithms have been proposed to address the neural architecture search (NAS) problem, but many of them require significant computational resources. Quantum-inspired evolutionary algorithms (QIEA), rooted in quantum computing principles, have shown promising results with respect to faster convergence. In this work, we propose Q-NAS (Quantum-inspired Neural Architecture Search): a quantum-inspired algorithm that searches for deep neural architectures by assembling substructures and optimizing some numerical hyperparameters. We present the first results of applying Q-NAS to the CIFAR-10 dataset, using only 20 K80 GPUs for about 50 hours. The resulting networks are relatively small (fewer than 20 layers) compared to other state-of-the-art models and achieve promising accuracies at considerably lower computational cost than other NAS algorithms.