Manuel Le Gallo, Riduan Khaddam-Aljameh, et al.
Nature Electronics
A critical bottleneck in the training of large neural networks (NNs) is communication with off-chip memory. A promising mitigation strategy is to integrate crossbar arrays of analogue memories in the back end of line (BEOL) to store the NN parameters and efficiently perform the required synaptic operations. The "Tiki-Taka" algorithm was developed to facilitate NN training in the presence of device nonidealities. However, no resistive switching device had so far been demonstrated that exhibits all of Tiki-Taka's fundamental requirements: many programmable states, a centred symmetry point, and low programming noise. Here, a complementary metal-oxide-semiconductor (CMOS)-compatible resistive random-access memory (RRAM) is presented for the first time, showing more than 30 programmable states with low noise and a symmetry point skewed by only 5% from the centre. These results enable the generalization of Tiki-Taka training from small fully connected networks to larger long short-term memory (LSTM) networks.
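To illustrate the device requirements the abstract names, the sketch below models an analogue weight with saturating, asymmetric programming pulses and shows why a centred symmetry point matters for Tiki-Taka-style training. This is a hypothetical toy model, not code or data from the paper: the device parameters (`N_STATES`, the noise level, the linear step model) and the tiny transfer loop are illustrative assumptions only.

```python
import numpy as np

# Illustrative analogue-device model (NOT from the paper): the update
# step shrinks as the weight approaches its bound in the pulse
# direction, so repeated up/down pulses balance at a "symmetry point".
N_STATES = 30                 # paper reports >30 programmable states
W_MIN, W_MAX = -1.0, 1.0
DW = (W_MAX - W_MIN) / N_STATES   # nominal change per programming pulse

def pulse(w, sign, noise=0.01, rng=None):
    """Apply one programming pulse (elementwise) to analogue weight(s) w.

    sign > 0 potentiates, otherwise depresses; both steps saturate
    toward the respective bound, giving an asymmetric update."""
    rng = rng if rng is not None else np.random.default_rng()
    up = DW * (W_MAX - w) / (W_MAX - W_MIN)        # shrinks near W_MAX
    down = -DW * (w - W_MIN) / (W_MAX - W_MIN)     # shrinks near W_MIN
    step = np.where(np.asarray(sign) > 0, up, down)
    w_new = w + step + noise * DW * rng.standard_normal(np.shape(w))
    return np.clip(w_new, W_MIN, W_MAX)

# Alternating up/down pulses drive the weight to the symmetry point,
# which for this model sits at (W_MAX + W_MIN) / 2 = 0 (i.e. centred).
rng = np.random.default_rng(0)
w = 0.8
for _ in range(500):
    w = pulse(w, +1, rng=rng)
    w = pulse(w, -1, rng=rng)
print(float(w))  # settles close to the symmetry point

# Tiki-Taka flavour (simplified): gradient pulses go to an auxiliary
# array A, whose asymmetric updates self-decay toward the symmetry
# point; A is periodically accumulated into the main weights C.
A = np.zeros((2, 2))
C = np.zeros((2, 2))
grad = np.array([[0.3, -0.2], [0.1, -0.4]])       # illustrative gradient
for t in range(20):
    A = pulse(A, -np.sign(grad), rng=rng)          # stochastic update on A
    if (t + 1) % 5 == 0:
        C = C + 0.5 * A                            # transfer step (gamma = 0.5)
```

A symmetry point skewed away from zero would bias the self-decay of `A`, which is why the paper's 5% skew figure is significant for this class of algorithm.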
Laura Begon-Lours, Mattia Halter, et al.
IMW 2021
Pritish Narayanan
VLSI Technology and Circuits 2025
Ankur Agrawal, Saekyu Lee, et al.
ISSCC 2021