Analog neuromorphic computing using programmable resistor arrays

Abstract

Digital logic technology has been extraordinarily successful, fueled by decades of gains in integration density and performance following Moore's law. This has led to societal changes in which more and more everyday functions are aided by smart devices on the path to artificial intelligence. However, in the field of deep machine learning, even this technology falls short, partly because device scaling gains are no longer easy to come by, but also because of the intractable energy cost of computation. Deep learning, using labeled data, can be mapped onto artificial neural networks: arrays in which the inputs and outputs are connected by programmable weights and which can perform pattern recognition functions. The learning process consists of finding the optimum weights; however, this process is very slow for large problems. Exploiting the fact that weights do not need to be determined with high precision, as long as they can be updated precisely, the device community has recognized that analog computation approaches, using physical arrays of memristor (programmable resistor) type devices, could offer significant speedup and power advantages over pure digital or pure software approaches. On the other hand, the history of analog computation is not reassuring, since the rule has been that more capable digital devices invariably supplant analog function. In this paper I discuss the opportunities and limitations of using analog techniques to accelerate the learning process in resistive neural networks.
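
The central idea, that an array of programmable resistances can carry out the multiply-accumulate operations of a neural-network layer in parallel, can be illustrated with a short simulation. The sketch below is not taken from the paper; it assumes an idealized crossbar in which each weight is stored as a device conductance, inputs are applied as row voltages, and each column current is the weighted sum given by Ohm's and Kirchhoff's laws. All names and parameter values are illustrative assumptions.

```python
# Minimal sketch (illustrative only): an idealized memristor crossbar
# performing an analog vector-matrix multiply, plus a coarse model of
# precise incremental weight updates on imprecisely known conductances.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 4, 3                                  # rows (inputs) x columns (outputs)
G = rng.uniform(1e-6, 1e-4, (n_in, n_out))          # device conductances (siemens) act as weights
v_in = rng.uniform(0.0, 0.2, n_in)                  # input voltages applied to the rows

# Each device contributes a current I = G * V; summing down a column
# (Kirchhoff's current law) yields one multiply-accumulate per output,
# computed in parallel across the whole array.
i_out = v_in @ G

# Weight update: programming pulses nudge each conductance by a small,
# well-controlled increment, even though the absolute conductance is
# only coarsely defined between its physical limits.
delta = rng.normal(0.0, 1e-7, G.shape)
G = np.clip(G + delta, 1e-6, 1e-4)

print(i_out)
```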

Date

01 May 2019

Publication

Solid-State Electronics

Authors
