Publication
ICIP 2022
Conference paper
Approximating ReLU Networks by Single-Spike Computation
Abstract
Developing energy-efficient neural network models is a topic of rapidly growing interest in the artificial intelligence community. Spiking neural networks (SNNs) are biologically inspired models that aim to exploit the energy efficiency shaped by a long process of evolution under limited resources. In this paper we propose an SNN model in which each neuron integrates piecewise linear postsynaptic potentials caused by input spikes and a positive bias, and spikes at most once. Mapping such a network into the ANN domain yields an approximation of a standard ReLU network, which enables training by backpropagation together with an adapted form of batch normalization. With backpropagation-trained weights, SNN inference offers sparse-signal, low-latency classification that can be readily adapted to a stream of input patterns, lending itself to an efficient hardware implementation. On supervised classification of the MNIST and Fashion-MNIST datasets, this approach achieves accuracy close to that of an ANN and above that of other single-spike SNNs.
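The abstract's neuron model can be illustrated with a minimal sketch: each input spike contributes a linear ramp postsynaptic potential, the bias adds a constant-slope ramp, and the neuron fires (at most once) at the first threshold crossing. This is an assumption-laden toy, not the paper's implementation; the function name, discretization, and parameter values (`theta`, `t_max`, `dt`) are hypothetical choices for illustration only.

```python
import numpy as np

def single_spike_time(in_times, weights, bias, theta=1.0, t_max=5.0, dt=1e-3):
    """Hypothetical sketch of a single-spike neuron with piecewise linear
    postsynaptic potentials: membrane potential is a sum of ramp PSPs, one
    per input spike, plus a constant-slope bias ramp. The neuron emits at
    most one spike, at the first threshold crossing."""
    t = np.arange(0.0, t_max, dt)
    # One linear ramp per input, starting at that input's spike time.
    psp = np.clip(t[:, None] - in_times[None, :], 0.0, None)
    v = psp @ weights + bias * t          # total membrane potential over time
    hit = np.nonzero(v >= theta)[0]       # first threshold crossing, if any
    return t[hit[0]] if hit.size else np.inf  # at most one output spike
```

Under this toy model, stronger weighted input drives the potential to threshold sooner, so an earlier spike time encodes a larger activation, which is the sense in which a time-to-first-spike code can approximate a ReLU-like response.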