Large-scale neural networks implemented with Non-Volatile Memory as the synaptic weight element: Impact of conductance response
We assess the impact of the conductance response of Non-Volatile Memory (NVM) devices employed as the synaptic weight element for on-chip acceleration of the training of large-scale artificial neural networks (ANN). We briefly review our previous work towards achieving competitive performance (classification accuracies) for such ANN with both Phase-Change Memory (PCM) and non-filamentary ReRAM based on PrCaMnO (PCMO), and towards assessing the potential advantages of such NVM-based hardware for ML training over GPU-based hardware in terms of speed (up to 25× faster) and power (120-2850× lower). We then discuss the 'jump-table' concept, previously introduced to model real-world NVM such as PCM or PCMO, which describes the full cumulative distribution function (CDF) of conductance-change at each device conductance value, for both potentiation (SET) and depression (RESET). Using several types of artificially constructed jump-tables, we assess the relative importance of deviations from an ideal NVM with perfectly linear conductance response.
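To illustrate the jump-table concept, the following is a minimal sketch (not the authors' implementation) of how a SET jump-table can drive a stochastic conductance update: for each conductance bin, the table stores a CDF over candidate conductance changes, and each programming pulse samples a change by inverse-CDF lookup. All names, shapes, and the toy nonlinearity (mean jump shrinking as conductance saturates) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SET jump-table: for each of N_G conductance bins, a CDF
# over N_DG candidate conductance-change values. Shapes and parameter
# values are assumptions for illustration only.
N_G, N_DG = 20, 50
g_min, g_max = 0.0, 1.0
dg_values = np.linspace(-0.05, 0.15, N_DG)   # candidate SET jumps

# Toy jump-table whose mean jump shrinks as G approaches saturation,
# a common nonlinearity in real NVM conductance response.
mean_jump = 0.1 * (1.0 - np.linspace(0, 1, N_G))       # per-bin mean
pdf = np.exp(-((dg_values[None, :] - mean_jump[:, None]) / 0.02) ** 2)
pdf /= pdf.sum(axis=1, keepdims=True)
cdf = np.cumsum(pdf, axis=1)                 # one CDF per conductance bin

def apply_set_pulse(g):
    """Sample a conductance change for one SET pulse from the jump-table."""
    bin_idx = min(int((g - g_min) / (g_max - g_min) * N_G), N_G - 1)
    u = rng.random()
    idx = min(np.searchsorted(cdf[bin_idx], u), N_DG - 1)
    return float(np.clip(g + dg_values[idx], g_min, g_max))

# Repeated SET pulses potentiate the device, with diminishing jumps.
g = 0.1
for _ in range(5):
    g = apply_set_pulse(g)
```

A depression (RESET) table would be built the same way with negative-mean jump distributions, and a "perfectly linear" ideal device corresponds to a degenerate table whose jump is constant and independent of the current conductance.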