John R. Kender, Rick Kjeldsen
IEEE Transactions on Pattern Analysis and Machine Intelligence
The theory of associative networks is increasingly providing tools for interpreting the update rules of artificial neural networks. At the same time, deriving neural learning rules from a solid theoretical foundation remains a fundamental challenge. We take steps in this direction by considering general energy-based associative networks of continuous neurons and synapses that evolve on multiple time scales. We use the separation of these time scales to recover a limit in which the activations of the neurons, the energy of the system, and the neural dynamics can all be recovered from a single generating function. By allowing the generating function to depend on memories, we recover the conventional Hebbian modeling choice for the interaction strength between neurons. Finally, we propose and discuss a dynamics of memories that enables learning to be included in this framework.
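As a point of reference only (this sketch is not taken from the paper itself), the "conventional Hebbian modeling choice" the abstract refers to is usually written, for K binary memory patterns \xi^\mu \in \{-1,+1\}^N, as the standard Hopfield-style coupling and energy below; the abstract presents the generating-function formulation as recovering this form when the generating function is allowed to depend on the memories.

% Hebbian couplings built from the stored patterns (standard textbook form)
J_{ij} = \frac{1}{N} \sum_{\mu=1}^{K} \xi_i^{\mu} \, \xi_j^{\mu},
\qquad
% Energy whose local minima lie near the stored patterns
E(\sigma) = -\frac{1}{2} \sum_{i \neq j} J_{ij} \, \sigma_i \, \sigma_j .

Under asynchronous updates \sigma_i \leftarrow \mathrm{sign}\!\big(\sum_{j} J_{ij}\,\sigma_j\big), this energy is non-increasing, which is the sense in which such a network is "energy-based."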