Orthogonal least squares learning algorithm for radial basis function networks
Chen, Sheng, Cowan, C. F. N. and Grant, P. M. (1991) Orthogonal least squares learning algorithm for radial basis function networks. IEEE Transactions on Neural Networks, 2 (2), 302-309.
Abstract
The radial basis function network offers a viable alternative to the two-layer neural network in many applications of signal processing. A common learning algorithm for radial basis function networks is based on first choosing randomly some data points as radial basis function centers and then using singular-value decomposition to solve for the weights of the network. Such a procedure has several drawbacks, and, in particular, an arbitrary selection of centers is clearly unsatisfactory. The authors propose an alternative learning procedure based on the orthogonal least-squares method. The procedure chooses radial basis function centers one by one in a rational way until an adequate network has been constructed. In the algorithm, each selected center maximizes the increment to the explained variance or energy of the desired output and does not suffer numerical ill-conditioning problems. The orthogonal least-squares learning strategy provides a simple and efficient means for fitting radial basis function networks. This is illustrated using examples taken from two different signal processing applications.
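As a rough illustration of the forward selection procedure summarised in the abstract, the sketch below implements the error-reduction-ratio criterion in Python, assuming Gaussian basis functions of fixed width and every training point as a candidate centre. The function names (gaussian_rbf, ols_select_centres), the fixed-width Gaussian choice and the fixed number of centres used as a stopping rule are illustrative assumptions, not the authors' code or the paper's exact formulation.

import numpy as np

def gaussian_rbf(x, c, width):
    # Gaussian radial basis function centred at c (fixed width assumed for illustration).
    return np.exp(-np.sum((x - c) ** 2, axis=-1) / (2.0 * width ** 2))

def ols_select_centres(X, d, width=1.0, n_centres=10):
    # Forward selection of RBF centres by orthogonal least squares:
    # each training point is a candidate centre, and at each step the
    # candidate whose orthogonalised regressor explains the largest share
    # of the desired-output energy (the error-reduction ratio) is kept.
    N = X.shape[0]
    P = np.stack([gaussian_rbf(X, X[i], width) for i in range(N)], axis=1)
    selected, Q = [], []              # chosen centre indices, orthogonal basis
    d_energy = d @ d
    for _ in range(n_centres):
        best_err, best_i, best_w = -1.0, None, None
        for i in range(N):
            if i in selected:
                continue
            w = P[:, i].copy()
            for q in Q:               # Gram-Schmidt against already-chosen regressors
                w -= (q @ P[:, i]) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:         # numerically dependent column, skip
                continue
            err = (w @ d) ** 2 / (denom * d_energy)
            if err > best_err:
                best_err, best_i, best_w = err, i, w
        if best_i is None:            # no linearly independent candidate remains
            break
        selected.append(best_i)
        Q.append(best_w)
    # Output-layer weights of the reduced network by ordinary least squares.
    weights, *_ = np.linalg.lstsq(P[:, selected], d, rcond=None)
    return selected, weights

In the paper the selection terminates when the cumulative error-reduction ratio reaches a chosen tolerance rather than after a fixed number of centres; the fixed n_centres stopping rule above is a simplification of that criterion.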
Text: TNN1991-2-2 - Author's Original
Text: 00080341.pdf - Other (restricted to repository staff only)
More information
Published date: 1 March 1991
Organisations:
Southampton Wireless Group
Identifiers
Local EPrints ID: 251135
URI: http://eprints.soton.ac.uk/id/eprint/251135
PURE UUID: a5753fbc-251d-4f5b-aed4-6557d3824153
Catalogue record
Date deposited: 12 Oct 1999
Last modified: 14 Mar 2024 05:09
Contributors
Author: Sheng Chen
Author: C. F. N. Cowan
Author: P. M. Grant