Publication
IMW 2020
Conference paper

Accelerating Deep Neural Networks with Analog Memory Devices


Abstract

Analog memory offers enormous potential to speed up computation in deep learning. We study the use of phase-change memory (PCM) as the resistive element in a crossbar array, which allows the multiply-accumulate operation in deep neural networks to be performed in-memory. This promise comes with several challenges, including this paper's main focus: the impact of conductance drift on deep neural network accuracy. Here we offer an overview of our recent work, including explanations of popular neural network architectures, along with a technique to compensate for drift ("slope correction") that allows in-memory computing with PCM during inference to reach software-equivalent deep learning baselines for a broad variety of important neural network workloads.
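
The sketch below illustrates, in simplified form, the two ideas named in the abstract: performing the multiply-accumulate in a crossbar (column currents sum the products of conductances and input voltages) and compensating conductance drift by rescaling the readout. The drift exponent, timescales, differential weight mapping, and the assumption of uniform drift are illustrative choices, not the paper's actual parameters or its slope-correction procedure.

```python
import numpy as np

# Illustrative (not from the paper) drift exponent and timescales.
NU = 0.05      # typical order of magnitude for a PCM drift exponent
T0 = 1.0       # reference time after programming, in seconds
T_READ = 1e5   # time of the inference read, in seconds

def program_weights(weights, g_max=25e-6):
    """Map signed weights to a conductance pair (G+, G-), a common differential scheme."""
    scale = g_max / np.max(np.abs(weights))
    g_pos = np.clip(weights, 0, None) * scale
    g_neg = np.clip(-weights, 0, None) * scale
    return g_pos, g_neg, scale

def drift(g, t, t0=T0, nu=NU):
    """Simple drift model: G(t) = G(t0) * (t / t0) ** (-nu), applied uniformly."""
    return g * (t / t0) ** (-nu)

def crossbar_mac(g_pos, g_neg, v_in):
    """In-memory multiply-accumulate: currents I = G * V sum along each column."""
    return v_in @ g_pos - v_in @ g_neg  # differential readout

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))   # small weight matrix
x = rng.normal(size=(1, 8))   # one input activation vector

g_pos, g_neg, scale = program_weights(W)
ideal = x @ W                 # software baseline

# Read after drift: every output shrinks by roughly (t / t0) ** (-nu).
g_pos_t, g_neg_t = drift(g_pos, T_READ), drift(g_neg, T_READ)
raw = crossbar_mac(g_pos_t, g_neg_t, x) / scale

# Slope-correction-style compensation (illustrative): rescale the readout by a
# single global factor that undoes the average conductance decay.
alpha = (T_READ / T0) ** NU
corrected = raw * alpha

print("ideal     :", np.round(ideal, 3))
print("drifted   :", np.round(raw, 3))
print("corrected :", np.round(corrected, 3))
```

Because drift is modeled as a uniform decay here, a single output-scaling factor restores the software baseline exactly; in real PCM arrays drift varies from device to device, which is part of why reaching software-equivalent accuracy is a challenge.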