Neuromorphic computing embraces the 'device history' offered by many analog non-volatile memory (NVM) devices to implement the small weight changes computed by a gradient-descent learning algorithm such as backpropagation. Deterministic and stochastic imperfections in the conductance response of real NVM devices can be encapsulated for modeling within a pair of 'jump-tables.' Such jump-tables describe the full cumulative distribution function of conductance-change at each device conductance value, for both weight potentiation (SET) and depression (RESET). First, using several types of artificially constructed jump-tables, we revisit the relative importance of deviations from an ideal NVM with a perfectly linear conductance response. Then, using jump-tables measured on improved non-filamentary resistive RAM devices based on Pr0.7Ca0.3MnO3 [see companion paper], we simulate the effects of their nonlinear conductance response on the training of a three-layer fully connected neural network. We find that, despite the relatively large conductance changes exhibited by any Pr0.7Ca0.3MnO3 device when either potentiating from its lowest conductance state or depressing from its highest conductance state, neural network training accuracies of >90% can be achieved. The highest accuracies are achieved by programming both conductances on each timestep ('fully bidirectional'), with the improved conductance on/off ratio of Al/Mo/PCMO resulting in marked improvements in training and test accuracy. Further accuracy improvements can be obtained by scaling the learning rate for potentiation (SET) by a factor of 1.66× relative to depression (RESET), to offset the slight asymmetry between the average size of the associated SET and RESET conductance changes. Finally, we show that the bidirectional programming of Al/Mo/PCMO can be used to implement high-density neuromorphic systems with a single conductance per synapse, at only a slight degradation in accuracy.
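The jump-table idea above can be made concrete with a small simulation sketch: for each stored conductance level, a table holds the cumulative distribution function (CDF) of the conductance change produced by one programming pulse, and a device update is drawn from that CDF by inverse-transform sampling. The table values and function names below are illustrative assumptions, not measured Pr0.7Ca0.3MnO3 data.

```python
import random

# Hypothetical SET jump-table (illustrative numbers only, not measured data).
# Keys: representative conductance levels (normalized to [0, 1]).
# Values: piecewise CDF of the conductance change dG as (dG, cumulative prob.) pairs.
# Note the nonlinear response: large jumps from the lowest state, saturation near G_max.
SET_JUMP_TABLE = {
    0.0: [(0.30, 0.5), (0.40, 1.0)],  # large conductance jumps from the lowest state
    0.5: [(0.05, 0.5), (0.10, 1.0)],  # moderate jumps in mid-range
    0.9: [(0.01, 0.5), (0.02, 1.0)],  # tiny jumps as the device saturates
}

def sample_dg(g, table, rng=random.random):
    """Sample a conductance change for current conductance g by
    inverse-transform sampling from the nearest jump-table row."""
    nearest = min(table, key=lambda level: abs(level - g))
    u = rng()
    for dg, cum_p in table[nearest]:
        if u <= cum_p:
            return dg
    return table[nearest][-1][0]

def potentiate(g, g_max=1.0):
    """Apply one SET pulse: the jump size depends on device history (g)."""
    return min(g_max, g + sample_dg(g, SET_JUMP_TABLE))
```

A matching RESET table would be used for depression; in a fully bidirectional scheme, each backpropagation weight update would trigger SET pulses on one conductance of the pair and RESET pulses on the other, with the SET learning rate scaled (e.g., by the 1.66× factor reported here) to compensate for the SET/RESET asymmetry.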