CAL (Context Aware Learning) is an artificial Hebbian network that simulates neurological structure and processes. Hebbian rules for synapse creation and weight updating allow CAL to learn rapidly from few examples, continuously and without supervision. This paper describes recent, neurologically inspired modifications to the algorithms that yield faster learning and greater predictive accuracy. The importance of plasticity, both structural and synaptic, is illustrated. A mechanism that limits the plasticity of the most relevant synapses leads to minimal forgetting of previously learned sequences.
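As an illustrative sketch only (not CAL's published algorithm), the following Python shows a generic Hebbian weight update in which a hypothetical per-synapse `relevance` factor attenuates plasticity, in the spirit of the forgetting-limiting mechanism mentioned above. The function name, the `relevance` parameter, and the `(1 - relevance)` scaling are all assumptions introduced for illustration.

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.1, relevance=None):
    """One Hebbian weight update with optional plasticity limiting.

    w         : (n_pre, n_post) synaptic weight matrix
    pre, post : activity vectors of pre- and post-synaptic neurons
    relevance : per-synapse values in [0, 1]; higher relevance reduces
                plasticity (a hypothetical stand-in for the paper's
                mechanism, not its actual formulation)
    """
    # Classic Hebbian term: a synapse strengthens when its pre- and
    # post-synaptic neurons are co-active.
    dw = lr * np.outer(pre, post)
    if relevance is not None:
        # Attenuate plasticity for the most relevant synapses so that
        # previously learned associations are largely preserved.
        dw *= (1.0 - relevance)
    return w + dw

# Usage example with small random weights and binary activities.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.01, size=(4, 3))
pre = np.array([1.0, 0.0, 1.0, 0.0])
post = np.array([0.0, 1.0, 1.0])
relevance = np.zeros((4, 3))
relevance[2, 2] = 0.9  # strongly protect one synapse deemed relevant
w = hebbian_update(w, pre, post, relevance=relevance)
```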