Publication
MRS Fall Meeting 2020
Talk

DNN Training Algorithm for Nonsymmetric Devices

Abstract

Resistive cross-point device arrays that can store and process data locally and in parallel are promising candidates for intensive deep neural network (DNN) training workloads. However, when training is performed with the conventional state-of-the-art learning algorithm, stochastic gradient descent (SGD) with backpropagation (BP), these array architectures must meet a set of stringent device requirements for the cross-point element. A key requirement is that the resistive devices must change conductance symmetrically when subjected to positive and negative pulse stimuli. Here, we present an alternative training algorithm, the so-called Tiki-Taka algorithm, which relaxes this stringent symmetry requirement on the cross-point element to a very large extent. We tested the validity of the Tiki-Taka algorithm on a range of network architectures, including fully connected, convolutional, and LSTM networks. For all of these networks, the Tiki-Taka algorithm with nonsymmetric device characteristics yields training accuracies on par with those achieved by the conventional SGD algorithm with symmetric devices. Thanks to the relaxed device specifications, this algorithmic improvement would enable earlier adoption of crossbar-based hardware accelerators for DNN training workloads.
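The abstract does not spell out the mechanics of the Tiki-Taka algorithm, so the sketch below is a minimal NumPy illustration of its core two-matrix idea as described in the published literature on the algorithm: gradient updates are accumulated on an auxiliary array A, whose device asymmetry acts as a built-in decay toward each element's symmetry point, and A is periodically transferred into the main array C, with the effective weight read out as W = gamma * A + C. The variable names, hyperparameter values, and the simplified device model here are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

# Sketch of a Tiki-Taka-style update for one weight layer.
# Assumptions (not from the abstract): effective weights are read as
# W = gamma * A + C; SGD-like rank-one updates land on the auxiliary
# array A through a nonsymmetric device model; columns of A are
# periodically transferred into C.

rng = np.random.default_rng(0)

n_in, n_out = 8, 4
gamma = 0.5            # mixing factor for the effective weight readout
lr = 0.1               # learning rate for updates onto A
asymmetry = 0.5        # up/down pulse imbalance of the toy device model
transfer_every = 10    # how often one column of A is moved into C
transfer_lr = 0.1      # learning rate of the A -> C transfer

A = np.zeros((n_out, n_in))   # auxiliary array, absorbs gradients
C = np.zeros((n_out, n_in))   # main array, holds the stable weights

def effective_weights():
    """Weights seen by the forward pass: W = gamma * A + C."""
    return gamma * A + C

def asymmetric_update(M, dM):
    """Toy nonsymmetric device: 'up' and 'down' pulses have different
    gains, so each element drifts toward its symmetry point (zero here),
    which effectively decays M in proportion to the applied pulses."""
    up = np.clip(dM, 0.0, None)
    down = np.clip(-dM, 0.0, None)
    # The step size shrinks as an element moves away from its symmetry
    # point in the direction of the pulse, and grows on the way back.
    M += up * (1.0 - asymmetry * np.tanh(M))
    M -= down * (1.0 + asymmetry * np.tanh(M))

for step in range(100):
    x = rng.standard_normal(n_in)          # activation vector
    delta = rng.standard_normal(n_out)     # backpropagated error
    grad = np.outer(delta, x)              # rank-one outer product

    # 1) SGD-like update applied to A through the nonsymmetric device.
    asymmetric_update(A, -lr * grad)

    # 2) Periodic transfer: read one column of A and add it to C.
    if step % transfer_every == 0:
        j = (step // transfer_every) % n_in
        C[:, j] += transfer_lr * A[:, j]

W = effective_weights()
print("effective weight norm:", np.linalg.norm(W))
```

In this picture, the symmetry requirement is relaxed because the asymmetry of A is no longer an uncontrolled error on the weights themselves; it only shapes the decay of the auxiliary array, while the slowly updated C carries the learned weights.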

