A feed-forward SNU-based version of this architecture achieved a test perplexity of 137.7, better than that of traditional natural language processing approaches such as the 5-gram method. To the best of our knowledge, this is the first example of language modelling performed with SNNs on the Penn Treebank dataset, and our result sets the state of the art for SNNs. Using sSNUs with recurrent connections further reduced the perplexity to 108.4, surpassing the corresponding LSTM-based architecture without dropout.
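For concreteness, below is a minimal PyTorch sketch of a single SNU cell, assuming the dynamics reported in the paper: the membrane state s_t integrates the input current, decays with a leak factor l(τ), and is reset wherever a spike y occurred, i.e. s_t = g(W x_t + l(τ) ⊙ s_{t-1} ⊙ (1 − y_{t-1})) and y_t = h(s_t + b), with g = ReLU and h a step function (SNU) or a sigmoid (sSNU). The class name, layer sizes, and decay value are illustrative assumptions; the recurrent variant additionally feeds y_{t-1} back through a recurrent weight matrix (omitted here), and training the hard-threshold SNU requires a surrogate gradient (not shown).

```python
import torch
import torch.nn as nn

class SNUCell(nn.Module):
    """Sketch of an SNU cell: s_t = g(W x_t + l(tau) * s_{t-1} * (1 - y_{t-1})),
    y_t = h(s_t + b), with g = ReLU and h = step (SNU) or sigmoid (sSNU)."""

    def __init__(self, input_size: int, hidden_size: int,
                 decay: float = 0.8, soft: bool = False):
        super().__init__()
        self.W = nn.Linear(input_size, hidden_size, bias=False)
        self.bias = nn.Parameter(torch.zeros(hidden_size))
        self.decay = decay   # l(tau): membrane leak factor, assumed constant
        self.soft = soft     # True -> sSNU (sigmoid output instead of a step)
        self.hidden_size = hidden_size

    def forward(self, x, state=None):
        if state is None:
            zeros = x.new_zeros(x.shape[0], self.hidden_size)
            state = (zeros, zeros)
        s_prev, y_prev = state
        # Integrate the input current; the (1 - y_prev) term resets the
        # membrane potential of neurons that spiked at the previous step.
        s = torch.relu(self.W(x) + self.decay * s_prev * (1.0 - y_prev))
        if self.soft:
            y = torch.sigmoid(s + self.bias)    # sSNU: continuous output
        else:
            # SNU: binary spikes; a surrogate gradient is needed to train
            # through this non-differentiable threshold.
            y = (s + self.bias > 0).float()
        return y, (s, y)

# Unrolling the cell over a toy sequence (10 steps, batch of 4, 16 inputs).
cell = SNUCell(input_size=16, hidden_size=32, soft=True)
x_seq = torch.randn(10, 4, 16)
state = None
for x_t in x_seq:
    y_t, state = cell(x_t, state)
print(y_t.shape)  # torch.Size([4, 32])
```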
The task of polyphonic music prediction on the Johann Sebastian Bach chorales dataset was to predict, at each time step, the set of notes (i.e. a chord) to be played at the next time step. We used an SNU-based architecture with an output layer of sigmoidal neurons, which allows a direct comparison of the obtained loss values with those from ANNs. The SNU-based network achieved an average loss of 8.72, setting the state of the art for SNNs on the Bach chorales dataset. An sSNU-based network further reduced the average loss to 8.39, surpassing corresponding architectures built from state-of-the-art ANN units.
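A rough sketch of the loss convention assumed above: each frame is a multi-hot vector over the note range, the sigmoidal output layer yields per-note probabilities, and the loss is the negative log-likelihood (binary cross-entropy) of the next frame, summed over the notes of the chord and averaged over time steps. The 88-note range and the random tensors below are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn

num_notes = 88  # assumed piano range; each chord is a multi-hot vector
logits = torch.randn(32, num_notes)                      # outputs for 32 steps
targets = torch.randint(0, 2, (32, num_notes)).float()   # next-step chords

# Sigmoid output layer + binary cross-entropy, summed over the notes of
# each chord, then averaged over time steps.
bce = nn.BCEWithLogitsLoss(reduction="none")
loss = bce(logits, targets).sum(dim=1).mean()
print(f"average per-step loss: {loss.item():.2f}")
```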
For a long time, SNN and ANN research on algorithms and AI hardware accelerator architectures has developed separately. In this paper, we bridged these neural network architectures by proposing the SNU, which incorporates biologically inspired neural dynamics in the form of a novel ANN unit, enabling broad adoption of such dynamics in challenging applications and opening new avenues for neuromorphic hardware acceleration.
Woźniak, S., Pantazi, A., Bohnstingl, T. et al. Deep learning incorporating biologically inspired neural dynamics and in-memory computing. Nat. Mach. Intell. 2, 325–336 (2020).