Feature learning using synaptic competition in a dynamically-sized neuromorphic architecture
Neuromorphic computing takes inspiration from how the brain works to explore novel computing paradigms. Recently, neuromorphic architectures using spiking neurons were proposed for the unsupervised learning of pattern- and feature-based representations. These approaches typically rely on a common winner-take-all (WTA) architectural motif of lateral inhibition, which introduces competition between the neurons. In this paper, we propose an alternative WTA architectural motif of synaptic competition, which introduces competition between the synapses. Because the WTA mechanism is moved from the neurons to the synapses, neuronal activation is not constrained during learning, and the architecture operates well in an online setting. Moreover, we show how the results of the synaptic competition provide information about novelty in the input, based on which we develop a feature-learning architecture that dynamically adjusts its size to the complexity of the dataset. We compare the performance of the proposed architecture with that of lateral inhibition on the task of unsupervised feature learning. Finally, we demonstrate high-accuracy results from an experimental realization of the architecture using a prototype neuromorphic platform with phase-change-based synapses.
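To make the contrast between the two WTA motifs concrete, the following is a minimal toy sketch (not the paper's actual learning rule): under lateral inhibition, only the most strongly driven neuron is selected and allowed to update, whereas under synaptic competition every neuron may fire, and the competition is instead held among each neuron's synapses. The parameter `k` (number of winning synapses per neuron) and the fixed potentiation step are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary input pattern and a small weight matrix:
# rows = output neurons, columns = input lines (synapses).
x = np.array([1.0, 0.0, 1.0, 1.0])
W = rng.random((3, 4))

# Neuronal WTA (lateral inhibition): a single winning neuron is
# selected, and only its synapses would be allowed to learn.
drive = W @ x
winner = int(np.argmax(drive))
neuron_mask = np.zeros(drive.shape[0], dtype=bool)
neuron_mask[winner] = True

# Synaptic competition (illustrative): within EVERY neuron, the
# synapses coinciding with active inputs compete, and only the
# top-k of them are potentiated. Neuronal activation is left
# unconstrained, so all neurons can learn in parallel.
k = 2
updates = np.zeros_like(W)
active = np.flatnonzero(x > 0)
for i in range(W.shape[0]):
    # rank neuron i's active synapses by current weight; potentiate top-k
    top = active[np.argsort(W[i, active])[-k:]]
    updates[i, top] = 0.1
```

Under lateral inhibition only the row `winner` of `W` would change, while in the synaptic-competition sketch every row receives exactly `k` potentiated synapses, which is what allows the motif to operate without constraining neuronal activation.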