Nano Letters

Analog resistive switching devices for training deep neural networks with the novel Tiki-Taka algorithm

A critical bottleneck for the training of large Neural Networks (NNs) is the communication with off-chip memory. A promising mitigation strategy consists of integrating crossbar arrays of analog memories in the Back-End-Of-Line to store the NN parameters and efficiently perform the required synaptic operations. The 'Tiki-Taka' algorithm was developed to facilitate NN training in the presence of device non-idealities. However, a resistive switching device exhibiting all the fundamental 'Tiki-Taka' requirements, namely many programmable states, a centered symmetry point, and low programming noise, had not yet been demonstrated. Here, a Complementary Metal-Oxide-Semiconductor (CMOS)-compatible Resistive Random Access Memory (RRAM) is presented for the first time, showing more than 30 programmable states with low noise and a symmetry point skewed by only 5% from the center. These results enable the generalization of 'Tiki-Taka' training from small Fully-Connected Networks to larger Long Short-Term Memory (LSTM) networks.
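To make the role of the device requirements concrete, the following is a minimal NumPy sketch of a Tiki-Taka-style two-array update. It is an illustration under simplifying assumptions, not the paper's implementation: the auxiliary array `A` accumulates rank-one gradient updates (as a crossbar would realize in place), a crude multiplicative decay stands in for the asymmetric device physics that pulls each element toward its symmetry point (assumed centered at zero), and the accumulated signal is periodically transferred into the weight array `C`. All names, the decay factor, and the hyperparameters are hypothetical.

```python
import numpy as np

def tiki_taka_step(A, C, x, delta, eta=0.1, lam=0.5, transfer=False):
    """One simplified Tiki-Taka-style update (illustrative only).

    A: auxiliary array that integrates gradients on the analog devices.
    C: weight array holding the low-noise, accumulated parameters.
    x, delta: forward activations and backpropagated errors for one layer.
    """
    # Gradient accumulation on A via a rank-one (outer-product) update,
    # the operation a crossbar array performs in place.
    A = A - eta * np.outer(delta, x)
    # Stand-in for asymmetric programming: each element of A drifts toward
    # its symmetry point (assumed at zero here), so only gradient components
    # that persist across steps accumulate, filtering programming noise.
    A = 0.99 * A
    if transfer:
        # Periodic transfer of the integrated signal from A into C.
        C = C + lam * A
    return A, C

# Toy usage: a 4x3 layer trained for a few random steps,
# transferring A into C every fifth step.
rng = np.random.default_rng(0)
A = np.zeros((4, 3))
C = rng.standard_normal((4, 3))
for step in range(10):
    x = rng.standard_normal(3)
    delta = rng.standard_normal(4)
    A, C = tiki_taka_step(A, C, x, delta, transfer=(step % 5 == 4))
```

The decisive device property in this scheme is the symmetry point: if it sits at the center of the conductance range (the paper reports only 5% skew), the drift term cancels random programming errors on `A` without biasing the accumulated gradient.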