The theory of associative networks increasingly provides tools to interpret the update rules of artificial neural networks. At the same time, deriving neural learning rules from a solid theory remains a fundamental challenge. We take some steps in this direction by considering general energy-based associative networks of continuous neurons and synapses that evolve on multiple time scales. Exploiting the separation of these time scales, we obtain a limit in which the activation of the neurons, the energy of the system, and the neural dynamics can all be recovered from a single generating function. By allowing the generating function to depend on memories, we recover the conventional Hebbian modeling choice for the interaction strength between neurons. Finally, we propose and discuss a dynamics of memories that enables us to include learning in this framework.
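The construction summarized above can be illustrated concretely. The following is a minimal NumPy sketch, under stated assumptions: a log-cosh generating function (so the activation is tanh), a Hebbian coupling J_ij = (1/N) Σ_μ ξ^μ_i ξ^μ_j built from the memories, fast neural relaxation, and a much slower drift of the memories standing in for learning. The specific generating function, update rules, and all names here are illustrative choices, not the paper's actual equations.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 50, 5                                 # neurons, stored memories
xi = rng.choice([-1.0, 1.0], size=(K, N))    # hypothetical binary memory patterns

def lagrangian(x):
    # Generating function L(x); the log-cosh choice gives g = dL/dx = tanh(x),
    # a standard sigmoidal activation.
    return np.sum(np.log(np.cosh(x)))

def activation(x):
    return np.tanh(x)                        # g_i = dL/dx_i

def coupling(xi):
    # Hebbian interaction strength recovered from the memories:
    # J_ij = (1/N) sum_mu xi^mu_i xi^mu_j
    return xi.T @ xi / xi.shape[1]

def energy(x, J):
    # Energy built from the same generating function via a Legendre-type
    # transform: E = x . g(x) - L(x) - (1/2) g . J g
    g = activation(x)
    return x @ g - lagrangian(x) - 0.5 * g @ J @ g

def neuron_step(x, J, tau=10.0, dt=0.1):
    # Fast neural dynamics: relaxation toward the Hebbian field,
    # tau dx/dt = -x + J g(x), which monotonically decreases the energy above.
    return x + (dt / tau) * (J @ activation(x) - x)

def memory_step(xi, data, tau_xi=1000.0, dt=0.1):
    # Slow memory dynamics (much larger time constant): memories drift
    # toward presented data, one illustrative way to include learning.
    return xi + (dt / tau_xi) * (data - xi)

J = coupling(xi)
x = xi[0] + 0.5 * rng.standard_normal(N)     # noisy cue of memory 0
for _ in range(500):
    x = neuron_step(x, J)
print("overlap with memory 0:", activation(x) @ xi[0] / N)
print("final energy:", energy(x, J))
```

Run as-is, the noisy cue relaxes toward the stored pattern (overlap close to 1) while the energy decreases, showing how activation, energy, and dynamics all follow from the one generating function; the two time constants (tau vs. tau_xi) mimic the time-scale separation the abstract relies on.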