Publication
NICE 2019
Conference paper
Synaptic plasticity in an artificial Hebbian network exhibiting continuous, unsupervised, rapid learning
Abstract
CAL (Context Aware Learning) is an artificial Hebbian network that simulates neurological structure and processes. Hebb's rules for synapse creation and weight update permit CAL to learn rapidly and continuously from few examples, without supervision. This paper describes recent, neurologically inspired modifications to the algorithms that lead to faster learning and greater prediction accuracy. The importance of plasticity, both structural and in synaptic weights, is illustrated. A mechanism that limits the plasticity of the most relevant synapses leads to minimal forgetting of previously learned sequences.
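The two ingredients named in the abstract, a Hebbian weight update and a mechanism that limits the plasticity of selected synapses, can be illustrated with a minimal sketch. This is not the paper's CAL algorithm; the function name, the outer-product form of the update, and the `plasticity` mask are illustrative assumptions, with the mask standing in for the idea of protecting the most relevant synapses from further change.

```python
import numpy as np

def hebbian_update(w, pre, post, eta=0.1, plasticity=None):
    """One Hebbian step: dw = eta * outer(post, pre), i.e. synapses
    whose pre- and post-synaptic activity coincide are strengthened.
    An optional per-synapse plasticity mask (hypothetical here) scales
    the update; a value of 0 freezes that synapse entirely."""
    dw = eta * np.outer(post, pre)
    if plasticity is not None:
        dw *= plasticity  # limit plasticity of protected synapses
    return w + dw

# Toy example: 3 presynaptic neurons projecting to 2 postsynaptic neurons.
w = np.zeros((2, 3))
pre = np.array([1.0, 0.0, 1.0])   # presynaptic activity
post = np.array([1.0, 0.5])       # postsynaptic activity
mask = np.ones((2, 3))
mask[0, 0] = 0.0                  # protect one "relevant" synapse from change
w = hebbian_update(w, pre, post, plasticity=mask)
```

After one step, only synapses with coincident activity change, and the masked synapse keeps its previous weight, which is the intuition behind reduced forgetting of already-learned sequences.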