Neuromorphic computing with analog memory can accelerate deep neural networks (DNNs) by enabling multiply-accumulate (MAC) operations to occur within memory. Analog memory, however, presents a number of device-level challenges with macro-level implications for the achievable accuracy and reliability of these artificial neural networks. This talk focuses on the adverse effects of conductance drift in phase-change memory (PCM) on network reliability. It is shown that conductance drift can be effectively compensated in a variety of networks by applying a ‘slope correction’ technique to the squashing functions, maintaining accuracy/reliability for a period of ∼1 year. In addition to conductance drift, PCM poses considerable variability challenges, which impact the accuracy of the initial weights. This talk summarizes recent advances in optimizing initial weight programming, and provides evidence suggesting that the combination of ‘slope correction’ and programming optimization techniques may allow DNN acceleration using analog memory while maintaining software-equivalent accuracy with reasonable reliability.
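To illustrate the idea behind ‘slope correction’, the following is a minimal sketch, assuming the common power-law model of PCM drift, G(t) = G(t₀)·(t/t₀)^(−ν), under which every analog MAC output is attenuated by roughly the same time-dependent factor. Scaling the slope of the squashing function by the inverse factor then restores the ideal pre-activation magnitude. The drift exponent ν, the time values, and the use of tanh as the squashing function are illustrative assumptions, not parameters from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

nu = 0.05            # assumed drift exponent (illustrative value)
t0, t = 1.0, 3.15e7  # seconds: programming time vs. ~1 year later

W = rng.normal(size=(4, 8))   # ideal weight matrix
x = rng.normal(size=8)        # input activations

# All conductances drift by (roughly) the same power-law factor,
# so the MAC output is globally attenuated.
drift = (t / t0) ** (-nu)
y_drifted = (W * drift) @ x

# Slope correction: scale the argument of the squashing function
# by the inverse of the drift factor before applying it.
alpha = (t / t0) ** nu
y_corrected = np.tanh(alpha * y_drifted)

y_ideal = np.tanh(W @ x)
print(np.allclose(y_corrected, y_ideal))  # True: drift is compensated
```

In practice ν varies from device to device, so the correction is only approximate; this sketch shows the best case where the drift factor is uniform.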