Publication
IRPS 2023
Conference paper

Impact of Phase-Change Memory Drift on Energy Efficiency and Accuracy of Analog Compute-in-Memory Deep Learning Inference (Invited)

Abstract

Among the emerging approaches for deep learning acceleration, compute-in-memory (CIM) in crossbar arrays, in conjunction with optimized digital computation and communication, is attractive for achieving high execution speeds and energy efficiency. Analog phase-change memory (PCM) is particularly promising for this purpose. However, PCM resistance typically drifts over time, which can degrade deep learning accuracy. Herein, we first discuss drift and noise mitigation by integrating projection liners into analog mushroom-type PCM devices, as well as the tradeoffs with dynamic range. We then study the impact of these devices on inference accuracy for the Transformer-based language model BERT. We find that the accuracy loss after extended drift can be minimal with an optimized mapping of weights to cells comprising two pairs of liner PCM devices of varying significance. Finally, we address the impact of drift on energy consumption during inference through a combination of drift, circuit, and architecture simulations. For a range of typical drift coefficients, we show that the peak vector-matrix multiplication (VMM) energy efficiency of a recently proposed heterogeneous CIM accelerator in 14 nm technology can increase by 3% to 15% over the course of one day to ten years. For convolutional neural network (CNN), long short-term memory (LSTM), and Transformer benchmarks, the increase in sustained energy efficiency remains below 10% and is greatest for models dominated by analog computation. Longer VMM integration times increase the energy impact of drift.
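The drift behavior the abstract refers to is conventionally described by an empirical power law, G(t) = G(t0) · (t/t0)^(−ν), where ν is the drift coefficient. The sketch below illustrates that standard model over the one-day-to-ten-year window mentioned above; the drift coefficient and conductance values are illustrative assumptions, not measurements from the paper. Since read currents scale with conductance, decaying conductances plausibly reduce the current integrated per VMM, consistent with the reported energy-efficiency increase.

```python
import numpy as np

# Standard empirical power-law model for PCM conductance drift:
#   G(t) = G(t0) * (t / t0) ** (-nu)
# nu = 0.05 and g0 = 10 uS below are illustrative assumptions only;
# projection liners reduce nu relative to unlined devices.

def drifted_conductance(g_t0, t, t0=1.0, nu=0.05):
    """Conductance after drifting from reference time t0 to time t (seconds)."""
    return g_t0 * (t / t0) ** (-nu)

g0 = 10e-6  # assumed initial conductance of 10 uS at t0 = 1 s
for label, t in [("1 day", 8.64e4), ("1 year", 3.15e7), ("10 years", 3.15e8)]:
    g = drifted_conductance(g0, t)
    print(f"{label:>8}: G = {g * 1e6:.2f} uS ({g / g0:.1%} of initial)")
```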
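The mapping of weights onto "two pairs of liner PCM devices of varying significance" can be made concrete with a small sketch. The encoding below is a generic multi-significance differential scheme assumed for illustration; the significance factor F, the conductance range, and the function names are hypothetical, and the paper's optimized mapping may differ.

```python
import numpy as np

# Illustrative weight mapping onto two differential PCM pairs of different
# significance:  W ~ F * (G_msp+ - G_msp-) + (G_lsp+ - G_lsp-).
# F = 8 and G_MAX = 25 uS are assumptions for illustration only.

F = 8.0        # significance factor between the two pairs (assumed)
G_MAX = 25e-6  # maximum programmable conductance per device (assumed)

def encode_weight(w):
    """Split a normalized weight w in [-1, 1] across two differential pairs."""
    target = w * (F + 1) * G_MAX                    # total signed conductance
    msp = np.clip(target / F, -G_MAX, G_MAX)        # most-significant pair
    lsp = np.clip(target - F * msp, -G_MAX, G_MAX)  # least-significant residual
    # Each signed value maps to a differential pair: |x| goes on G+ if x > 0,
    # on G- otherwise.
    return msp, lsp

def decode_weight(msp, lsp):
    """Recover the normalized weight from the two signed pair conductances."""
    return (F * msp + lsp) / ((F + 1) * G_MAX)

msp, lsp = encode_weight(0.37)
print(decode_weight(msp, lsp))  # ~0.37
```

Because the least-significant pair contributes only a small residual, drift-induced errors on it perturb the effective weight far less than errors on the most-significant pair, which is one plausible reason such a mapping can keep accuracy loss minimal after extended drift.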