Publication
DAC 2021
Poster

Reliable RRAM-based In-Memory Computing in Light of Model Stability

Abstract

RRAM-based in-memory computing (IMC) effectively accelerates DNNs. Quantization and pruning improve hardware performance but aggravate the effect of RRAM device variations and reduce post-mapping accuracy. This work proposes model stability as a new metric to guide algorithmic solutions. Based on 65nm statistical RRAM data, we incorporate algorithm and architecture parameters to benchmark post-mapping accuracy and hardware performance. Furthermore, we develop a novel variation-aware training method to improve model stability, in which there exists an optimal scale of training variation for the best accuracy. Experimental evaluation shows up to 21% improvement in post-mapping accuracy on the CIFAR-10, CIFAR-100, and SVHN datasets.
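
The variation-aware training idea can be illustrated with a minimal PyTorch sketch. This is our own illustrative assumption, not the authors' published code or exact formulation: device variation is emulated by multiplicative noise applied to the weights during the forward pass of training, and the noise scale sigma plays the role of the "scale of training variation" for which the abstract reports an optimum.

    import torch
    import torch.nn as nn

    class VariationAwareLinear(nn.Linear):
        """Linear layer that injects multiplicative weight noise during training
        to emulate RRAM conductance variation (illustrative sketch only)."""

        def __init__(self, in_features, out_features, sigma=0.1, bias=True):
            super().__init__(in_features, out_features, bias=bias)
            # sigma: assumed scale of training variation; a tunable hyperparameter
            self.sigma = sigma

        def forward(self, x):
            if self.training and self.sigma > 0:
                # Log-normal multiplicative noise is a common way to model
                # device-to-device conductance variation in RRAM arrays.
                noise = torch.exp(torch.randn_like(self.weight) * self.sigma)
                weight = self.weight * noise
            else:
                weight = self.weight
            return nn.functional.linear(x, weight, self.bias)

Sweeping sigma over a small grid and selecting the value with the best post-mapping accuracy would be one way to locate the optimal scale of training variation described above.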