Y.Y. Li, K.S. Leung, et al.
J Combin Optim
The theory of associative networks increasingly provides tools for interpreting the update rules of artificial neural networks. At the same time, deriving neural learning rules from a solid theory remains a fundamental challenge. We take steps in this direction by considering general energy-based associative networks of continuous neurons and synapses that evolve on multiple timescales. We use the separation of these timescales to recover a limit in which the activation of the neurons, the energy of the system, and the neural dynamics can all be derived from a single generating function. By allowing the generating function to depend on memories, we recover the conventional Hebbian choice for the interaction strength between neurons. Finally, we propose and discuss a dynamics of memories that enables us to include learning in this framework.
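The Hebbian storage and energy-lowering relaxation that the abstract alludes to can be illustrated in the classical discrete limit of an associative network. The sketch below is an assumption-laden toy, not the paper's continuous multi-timescale model: it fixes the synapses (the slow variables) via the Hebbian outer-product rule and lets the neurons (the fast variables) relax by asynchronous sign updates, each of which can only lower the Hopfield energy. The network size, number of memories, and corruption level are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store a few random binary patterns with the Hebbian rule
# (slow synaptic variables, held fixed during neural relaxation).
N, P = 64, 3
xi = rng.choice([-1.0, 1.0], size=(P, N))  # memories
W = xi.T @ xi / N                          # Hebbian couplings
np.fill_diagonal(W, 0.0)                   # no self-interaction

def energy(s):
    """Hopfield energy E(s) = -1/2 s^T W s."""
    return -0.5 * s @ W @ s

def recall(s, sweeps=10):
    """Fast neural dynamics: asynchronous sign updates.

    Each single-neuron flip is accepted only if it does not raise
    the energy, so E(s) is nonincreasing along the trajectory.
    """
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Corrupt one stored memory and let the neurons relax back toward it.
probe = xi[0] * np.where(rng.random(N) < 0.15, -1.0, 1.0)
recovered = recall(probe)
```

In the paper's setting the couplings would themselves evolve on a slower timescale, and the energy, activations, and dynamics would all descend from one generating function; the toy above only captures the fast-relaxation endpoint of that picture.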