Building Symmetries into Feedforward Networks
Shawe-Taylor, John (1989) Building Symmetries into Feedforward Networks. In Proceedings of First IEE Conference on Artificial Neural Networks, London, pp. 158-162. Institution of Electrical Engineers.
Record type: Conference or Workshop Item (Paper)
Abstract
One of the central tools developed by M. Minsky and S. Papert (1988) was the group invariance theorem. This theorem is concerned with choosing perceptron weights to recognise a predicate that is invariant under a group of permutations of the input. The theorem states that the weights can be chosen to be constant on equivalence classes of predicates under the action of the group. This paper presents this result in a graph-theoretic light and then extends consideration to multilayer perceptrons. It is shown that, by choosing a multilayer network in such a way that the action of the group on the input nodes can be extended to the whole network, the invariance of the output under the action of the group can be guaranteed. This greatly reduces the number of degrees of freedom in the training of such a network. An example of using this technique to train a network to recognise isomorphism classes of graphs is given. This compares favourably with previous experiments using standard back-propagation. The connections between the group of symmetries and the network structure are explored, and the relation to the problem of graph isomorphism is discussed.
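The construction the abstract describes — extending the group action from the input nodes to the whole network and tying weights across orbits — can be illustrated with a minimal sketch. The code below is not the paper's construction; it is a hypothetical two-layer example for the cyclic group of shifts, where one shared hidden weight vector is applied at every cyclic shift of the input and a permutation-invariant sum is taken at the output.

```python
import numpy as np

# Sketch (assumed example, not the paper's network): output invariant
# under cyclic shifts of the input. The group action on input nodes is
# extended to the hidden layer by indexing hidden units by group
# elements (shifts); weight tying across this orbit leaves one free
# weight vector w and one output weight v instead of n*(n+1) weights.

rng = np.random.default_rng(0)
n = 6
w = rng.normal(size=n)   # shared hidden weights (one equivalence class)
v = 1.0                  # output weight, constant across hidden units

def forward(x):
    # hidden unit k sees the input cyclically shifted by k
    hidden = np.tanh([w @ np.roll(x, k) for k in range(n)])
    # summing over the group orbit makes the output shift-invariant
    return v * hidden.sum()

x = rng.normal(size=n)
print(np.isclose(forward(x), forward(np.roll(x, 2))))  # True: invariance
```

Shifting the input by any amount only permutes which hidden unit computes which dot product, so the summed output is unchanged — the weight-sharing guarantees invariance by construction rather than requiring it to be learned.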
This record has no associated files available for download.
More information
Published date: 1989
Organisations: Electronics & Computer Science
Identifiers
Local EPrints ID: 259738
URI: http://eprints.soton.ac.uk/id/eprint/259738
ISBN: 0 85296 388 2
PURE UUID: 60ef35ff-489b-4d2c-9f5f-c4ed603a0de6
Catalogue record
Date deposited: 12 Aug 2004
Last modified: 04 Mar 2024 18:12
Contributors
Author: John Shawe-Taylor