Publication
IJCNN 2019
Conference paper
Quantum-Inspired Neural Architecture Search
Abstract
Deep neural networks have gained attention in the last decade as significant progress has been made in a variety of tasks thanks to these new architectures. Most of this success is due to hand-designed networks. However, this engineering process demands considerable time and expert knowledge, which has led to increasing interest in automating the design of deep architectures. Several new algorithms have been proposed to address the neural architecture search problem, but many of them require significant computational resources. Quantum-inspired evolutionary algorithms (QIEA) have their roots in quantum computing principles and present promising results with respect to faster convergence. In this work, we propose Q-NAS (Quantum-inspired Neural Architecture Search): a quantum-inspired algorithm to search for deep neural architectures by assembling substructures and optimizing some numerical hyperparameters. We present the first results applying Q-NAS to the CIFAR-10 dataset using only 20 K80 GPUs for about 50 hours. The obtained networks are relatively small (fewer than 20 layers) compared to other state-of-the-art models and achieve promising accuracies at considerably lower computational cost than other NAS algorithms.
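To make the search strategy concrete, below is a minimal sketch of a quantum-inspired evolutionary loop in the spirit described by the abstract: each "quantum gene" is a probability distribution over candidate layer substructures, concrete architectures are sampled from it, evaluated, and the distributions are nudged toward the best individual found so far. The block names, layer count, update rule, and the dummy_fitness stand-in (which would be replaced by actually training the assembled network on CIFAR-10) are illustrative assumptions, not the paper's exact encoding or operators.

```python
import random

# Hypothetical building blocks; in Q-NAS these would be the substructures
# assembled into a deep network (names chosen for illustration only).
BLOCKS = ["conv3x3", "conv5x5", "maxpool", "identity"]
NUM_LAYERS = 8          # length of the architecture encoding
POP_SIZE = 10           # architectures sampled per generation
GENERATIONS = 30
LEARNING_RATE = 0.1     # how strongly distributions move toward the best individual


def dummy_fitness(architecture):
    """Stand-in for training the assembled network and returning validation
    accuracy; here it simply rewards conv3x3 layers so the loop has a target."""
    return sum(1.0 for block in architecture if block == "conv3x3") / NUM_LAYERS


def sample(quantum_individual):
    """Collapse each per-layer probability vector into a concrete block choice."""
    return [random.choices(BLOCKS, weights=probs)[0] for probs in quantum_individual]


def update(quantum_individual, best_architecture):
    """Shift each distribution toward the block chosen by the best classical
    individual (a simple analogue of the quantum rotation-gate update)."""
    for probs, best_block in zip(quantum_individual, best_architecture):
        for i, block in enumerate(BLOCKS):
            target = 1.0 if block == best_block else 0.0
            probs[i] += LEARNING_RATE * (target - probs[i])
        total = sum(probs)
        for i in range(len(probs)):
            probs[i] /= total


# One quantum individual: a uniform probability vector per layer position.
quantum_individual = [[1.0 / len(BLOCKS)] * len(BLOCKS) for _ in range(NUM_LAYERS)]

best_arch, best_fit = None, -1.0
for gen in range(GENERATIONS):
    population = [sample(quantum_individual) for _ in range(POP_SIZE)]
    for arch in population:
        fit = dummy_fitness(arch)
        if fit > best_fit:
            best_arch, best_fit = arch, fit
    update(quantum_individual, best_arch)

print("best architecture:", best_arch, "fitness:", round(best_fit, 3))
```

In a real NAS setting the fitness evaluation dominates the cost, since every sampled architecture must be trained; the probabilistic encoding is what the quantum-inspired formulation contributes toward faster convergence.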