The Interpolation Capabilities of the Binary CMAC
Brown, M., Harris, C.J. and Parks, P.C. (1993) The Interpolation Capabilities of the Binary CMAC. Neural Networks, 6 (3), 429-440.
Abstract
Within the neurocontrol field, the CMAC has often been proposed as a basic learning element because of its simple design and rapid learning rate. However, despite its use as a general nonlinear functional approximator, very little theory exists about the interpolation capabilities of the binary CMAC. In this paper the binary CMAC is described and an expression for its memory requirements is derived. It is then shown that the basis function vectors span a space whose dimension depends on the length of the weight vector and on the generalisation parameter. From these two expressions it is possible to measure the space of functions which the CMAC can interpolate, and it is shown that, in general, the binary CMAC cannot reproduce an arbitrary multivariate look-up table. This provides an upper bound on the size of the class of functions which the multivariate CMAC can interpolate. Next it is shown that if the desired multivariate (look-up table) function is formed from a linear combination of bounded univariate piecewise constant functions, then there exists a well-defined binary CMAC which is output-equivalent to the desired function. This provides a lower bound on the type of functions that the multivariate CMAC can interpolate. A set of consistency equations is then derived which the training data must satisfy if the CMAC is to store the information exactly. Finally, these consistency equations are used to construct a set of desired functions which the CMAC is completely unable to model. Many neural networks can approximate continuous nonlinear functions arbitrarily well, given infinite resources; this work, however, is aimed at a more useful measure of the modelling (approximation) capabilities of a network: given a particular network structure (finite resources), what functions can be modelled exactly?
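The abstract summarises the construction only briefly; as a rough illustration, the sketch below implements a conventional Albus-style binary CMAC in Python: rho overlapping coarse tilings of the input lattice (rho is the generalisation parameter), exactly one active binary basis function per tiling, and an output equal to the sum of the rho selected weights. The class name, the diagonal displacement of the tilings, and the LMS-style training rule are assumptions made for this example and are not taken from the paper.

```python
import numpy as np

class BinaryCMAC:
    """Minimal binary CMAC sketch (illustrative; not the paper's exact construction).

    rho overlapping tilings cover the integer input lattice; each tiling is a
    coarse grid of cells of width rho, displaced by one unit per tiling.  Every
    input activates exactly one binary basis function per tiling, so the output
    is the sum of rho weights.
    """

    def __init__(self, n_inputs, resolution, rho):
        self.rho = rho                      # generalisation parameter
        self.resolution = resolution        # lattice points per input axis
        # coarse cells per axis in one tiling (a slight over-allocation is harmless here)
        self.cells = resolution // rho + 2
        # one weight table per tiling: rho * cells**n_inputs weights in total
        self.weights = np.zeros((rho,) + (self.cells,) * n_inputs)

    def _active_cells(self, x):
        """Yield (tiling index, cell index tuple) for the rho active basis functions."""
        x = np.asarray(x, dtype=int)
        for j in range(self.rho):
            # displace each tiling by j units before coarse quantisation
            yield j, tuple((x + j) // self.rho)

    def output(self, x):
        """Network output: sum of the rho weights selected by the binary basis functions."""
        return sum(self.weights[j][c] for j, c in self._active_cells(x))

    def train(self, x, target, lr=0.5):
        """LMS-style update: share the output error equally over the rho active weights."""
        err = target - self.output(x)
        for j, c in self._active_cells(x):
            self.weights[j][c] += lr * err / self.rho


# Toy usage: store a few samples of a 2-D look-up table and check recall.
cmac = BinaryCMAC(n_inputs=2, resolution=16, rho=4)
data = [((2, 3), 1.0), ((7, 1), -0.5), ((10, 12), 2.0)]
for _ in range(50):
    for x, t in data:
        cmac.train(x, t)
print([round(cmac.output(x), 3) for x, _ in data])
```

With rho = 1 each input selects a single private weight and the network degenerates to a full look-up table; for rho > 1 the overlapping basis functions couple neighbouring inputs, which is the source of the interpolation limits (and the consistency equations on the training data) analysed in the paper.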
More information
Published date: 1993
Organisations: Southampton Wireless Group
Identifiers
Local EPrints ID: 250259
URI: http://eprints.soton.ac.uk/id/eprint/250259
PURE UUID: 9e8112f4-e46a-4e1d-8ad2-562178d8700f
Catalogue record
Date deposited: 04 May 1999
Last modified: 10 Dec 2021 20:07
Contributors
Author: M. Brown
Author: C.J. Harris
Author: P.C. Parks