Nonlinear identification using orthogonal forward regression with nested optimal regularization
Hong, Xia, Chen, Sheng, Gao, Junbin and Harris, Chris J. (2015) Nonlinear identification using orthogonal forward regression with nested optimal regularization. IEEE Transactions on Cybernetics, 45 (12), 2925-2936. (doi:10.1109/TCYB.2015.2389524).
Abstract
An efficient data-based modeling algorithm for nonlinear system identification is introduced for radial basis function (RBF) neural networks, with the aim of maximizing generalization capability based on the concept of leave-one-out (LOO) cross validation. Each RBF kernel has its own kernel width parameter, and the basic idea is to optimize the multiple pairs of regularization parameters and kernel widths, each pair associated with one kernel, one at a time within the orthogonal forward regression (OFR) procedure. Thus, each OFR step consists of one model term selection based on the LOO mean square error (LOOMSE), followed by the optimization of the associated kernel width and regularization parameter, also based on the LOOMSE. Since the same LOOMSE is adopted for model selection as in our previous state-of-the-art local regularization assisted orthogonal least squares (LROLS) algorithm, the proposed new OFR algorithm is also capable of producing a very sparse RBF model with excellent generalization performance. Unlike the LROLS algorithm, which requires an additional iterative loop to optimize the regularization parameters as well as a separate procedure to optimize the kernel width, the proposed OFR algorithm optimizes both the kernel widths and the regularization parameters within a single OFR procedure, and consequently the required computational complexity is dramatically reduced. Nonlinear system identification examples are included to demonstrate the effectiveness of this new approach in comparison with the well-known support vector machine and least absolute shrinkage and selection operator approaches, as well as the LROLS algorithm.
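The step structure described in the abstract can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes Gaussian kernels, takes candidate centres from the training inputs, and uses a simple grid search (over hypothetical `width_grid` and `lam_grid` values) in place of the paper's nested optimization of each term's kernel width and regularization parameter; the LOO residuals are obtained from the hat-matrix diagonal of the regularized orthogonal model.

```python
# Sketch only: OFR loop where each step (1) selects one RBF term by LOOMSE and
# (2) tunes that term's kernel width and regularization parameter, also by LOOMSE.
# Grid values, default width0/lam0 and helper names are illustrative assumptions.
import numpy as np

def rbf_column(X, centre, width):
    """Gaussian RBF regressor column evaluated at every training input."""
    d2 = np.sum((X - centre) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def loomse_after_adding(w, lam, residual, leverage):
    """LOOMSE, weight, residuals and hat diagonal if orthogonal column w is appended."""
    denom = w @ w + lam
    gk = (w @ residual) / denom          # regularized weight of the new orthogonal column
    e = residual - gk * w                # updated training residuals
    h = leverage + w ** 2 / denom        # updated diagonal of the regularized hat matrix
    return np.mean((e / (1.0 - h)) ** 2), gk, e, h

def orthogonalize(phi, W):
    """Gram-Schmidt phi against the already selected orthogonal columns."""
    w = phi.copy()
    for wk in W:
        w -= (wk @ phi) / (wk @ wk) * wk
    return w

def ofr_nested_loomse(X, y, n_terms=10, width0=1.0, lam0=1e-4,
                      width_grid=(0.2, 0.5, 1.0, 2.0),
                      lam_grid=(1e-6, 1e-4, 1e-2, 1.0)):
    N = X.shape[0]
    W, model = [], []                    # orthogonal columns; (centre, width, lam, weight)
    residual = y.astype(float).copy()
    leverage = np.zeros(N)

    for _ in range(n_terms):
        # 1) Term selection: pick the centre giving the smallest LOOMSE
        #    with the default width and regularization parameter.
        best_idx, best_loomse = None, np.inf
        for idx in range(N):             # candidate centres = training inputs
            w = orthogonalize(rbf_column(X, X[idx], width0), W)
            if w @ w < 1e-12:
                continue
            loomse, *_ = loomse_after_adding(w, lam0, residual, leverage)
            if loomse < best_loomse:
                best_idx, best_loomse = idx, loomse

        # 2) Nested tuning: optimize this term's width and lambda by LOOMSE
        #    (plain grid search here; the paper derives more efficient updates).
        best = None
        for width in width_grid:
            w = orthogonalize(rbf_column(X, X[best_idx], width), W)
            if w @ w < 1e-12:
                continue
            for lam in lam_grid:
                loomse, gk, e, h = loomse_after_adding(w, lam, residual, leverage)
                if best is None or loomse < best[0]:
                    best = (loomse, width, lam, w, gk, e, h)

        _, width, lam, w, gk, residual, leverage = best
        W.append(w)
        # gk is the weight in the orthogonal basis; mapping back to the original
        # RBF weights via back-substitution of the OFR decomposition is omitted.
        model.append((best_idx, width, lam, gk))
    return model
```

The sketch only mirrors the two-stage structure of each OFR step; the paper's contribution is doing the width and regularization optimization analytically and efficiently inside this loop rather than by exhaustive search.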
Text: CYB2015-12.pdf - Version of Record (restricted to Repository staff only)
More information
Accepted/In Press date: 6 January 2015
e-pub ahead of print date: 27 January 2015
Published date: December 2015
Keywords:
cross validation, forward regression, identification, leave-one-out errors, nonlinear system, regularization
Organisations:
Southampton Wireless Group
Identifiers
Local EPrints ID: 384029
URI: http://eprints.soton.ac.uk/id/eprint/384029
ISSN: 2168-2267
PURE UUID: ecb8585a-d4a8-487a-ad77-c9178f7d7bca
Catalogue record
Date deposited: 07 Dec 2015 13:49
Last modified: 14 Mar 2024 21:51
Contributors
Author: Xia Hong
Author: Sheng Chen
Author: Junbin Gao
Author: Chris J. Harris