The Effect of Hebbian Learning on Optimisation in Hopfield Networks
Watson, Richard A., Buckley, C. L. and Mills, Rob (2009) The Effect of Hebbian Learning on Optimisation in Hopfield Networks.
In neural networks, two dynamical behaviours are well known: 1) a network naturally settles into patterns of activation that locally minimise the constraints among its interactions, which can be understood as the local minimisation of an energy or potential function, or the optimisation of an objective function; 2) in distinct scenarios, Hebbian learning can create new interactions that form associative memories of activation patterns. In this paper we show that these two behaviours have a surprising interaction: learning of this type significantly improves the ability of a neural network to find configurations that satisfy constraints, i.e. to perform effective optimisation. Specifically, the network develops a memory of the attractors it has visited but, importantly, is able to generalise over previously visited attractors to enlarge the basins of attraction of superior attractors before they are visited. The network is ultimately transformed into a different network with only one basin of attraction, whose attractor corresponds to a configuration of very low energy in the original network. The new network thus finds optimised configurations that were effectively unattainable (had exponentially small basins of attraction) under the original network dynamics.
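The two behaviours described in the abstract can be sketched in a few lines: asynchronous descent to a local energy minimum of a fixed symmetric network, interleaved with a small Hebbian update that reinforces each attractor found. This is a minimal illustration, not the authors' exact experimental setup; the network size, learning rate, and number of restarts are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50  # number of neurons (illustrative choice)

# Original network: random symmetric weights, zero self-interaction.
W0 = rng.standard_normal((N, N))
W0 = (W0 + W0.T) / 2
np.fill_diagonal(W0, 0)

def energy(W, s):
    """Hopfield energy of state s under weights W."""
    return -0.5 * s @ W @ s

def relax(W, s, sweeps=50):
    """Asynchronous deterministic descent to a local attractor.

    With symmetric W and zero diagonal, each single-unit update
    can only lower (or leave unchanged) the energy under W.
    """
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Repeated relaxations from random initial states, with a small
# Hebbian reinforcement of each visited attractor (rate assumed).
W = W0.copy()
eta = 0.002
for _ in range(60):
    s = rng.choice([-1, 1], size=N)
    s = relax(W, s)
    W += eta * np.outer(s, s)  # Hebbian update toward the attractor
    np.fill_diagonal(W, 0)     # keep self-interactions at zero
```

Under the paper's argument, states reached by the learned network `W` should be evaluated against the *original* energy `energy(W0, s)`: the claim is that learning reshapes the basins so that relaxation increasingly lands in configurations of low original energy, eventually leaving a single dominant attractor.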
Item Type: Monograph (Technical Report)
Divisions: Faculty of Physical and Applied Science > Electronics and Computer Science > EEE; Faculty of Physical and Applied Science > Electronics and Computer Science > Agents, Interactions & Complexity
Date Deposited: 09 Jun 2009 22:19
Last Modified: 02 Mar 2012 12:41
Contributors: Watson, Richard A. (Author); Buckley, C. L. (Author); Mills, Rob (Author)
Date: 10 June 2009