Robust nonlinear model identification methods using forward regression
Hong, X., Harris, C.J., Chen, S. and Sharkey, P.M.
(2003)
Robust nonlinear model identification methods using forward regression.
IEEE Transactions on Systems, Man and Cybernetics, Part A, 33 (4), 514-523.
Abstract
In this correspondence, new robust nonlinear model construction algorithms for a large class of linear-in-the-parameters models are introduced to enhance model robustness via combined parameter regularization and new robust structural selection criteria. In parallel with parameter regularization, we use two classes of robust model selection criteria: experimental design criteria that optimize model adequacy, and the predicted residual sums of squares (PRESS) statistic, which optimizes model generalization capability. Three robust identification algorithms are introduced: regularized orthogonal least squares combined with A-optimality, regularized orthogonal least squares combined with D-optimality, and regularized orthogonal least squares combined with the PRESS statistic. A common characteristic of these algorithms is that the inherent computational efficiency of the orthogonalization scheme in orthogonal least squares and regularized orthogonal least squares carries over, so the new algorithms remain computationally efficient. Numerical examples are included to demonstrate the effectiveness of the algorithms.
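The abstract describes the approach only at a high level. As a rough illustration of the underlying idea, the Python sketch below performs greedy forward regression over a matrix of candidate regressors, scoring each candidate subset with a PRESS-like leave-one-out error computed from a ridge-regularized least-squares fit. This is not the paper's algorithm: the paper achieves the same kind of selection far more efficiently through an orthogonalization (regularized orthogonal least squares) scheme, and the function names, regularization parameter lam, and stopping rule here are illustrative assumptions.

# Minimal sketch (not the paper's orthogonalized algorithm): greedy forward
# selection of regressors for a linear-in-the-parameters model, where each
# candidate term is scored by a PRESS-like leave-one-out error from a
# ridge-regularized fit. Names and defaults are illustrative assumptions.
import numpy as np

def press_statistic(P, y, lam=1e-3):
    """Leave-one-out (PRESS-like) mean squared error of a ridge fit y ~ P @ theta."""
    # Hat matrix of the regularized least-squares estimator
    H = P @ np.linalg.solve(P.T @ P + lam * np.eye(P.shape[1]), P.T)
    residuals = y - H @ y
    # PRESS residual: e_i / (1 - h_ii)
    loo = residuals / (1.0 - np.diag(H))
    return np.mean(loo ** 2)

def forward_select(candidates, y, max_terms=10, lam=1e-3):
    """Greedily add the candidate column that most reduces the PRESS error."""
    n, M = candidates.shape
    selected, best_score = [], np.inf
    for _ in range(max_terms):
        scores = []
        for j in range(M):
            if j in selected:
                scores.append(np.inf)
                continue
            P = candidates[:, selected + [j]]
            scores.append(press_statistic(P, y, lam))
        j_best = int(np.argmin(scores))
        if scores[j_best] >= best_score:  # stop when PRESS no longer improves
            break
        best_score = scores[j_best]
        selected.append(j_best)
    # Regularized least-squares estimate for the selected terms
    Ps = candidates[:, selected]
    theta = np.linalg.solve(Ps.T @ Ps + lam * np.eye(len(selected)), Ps.T @ y)
    return selected, theta

In practice the candidate matrix would hold the model's basis terms (for example, polynomial or radial basis function expansions of lagged inputs and outputs); the orthogonalized formulation in the paper avoids refitting from scratch for every candidate term, which is what this naive sketch does.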
Text: 01235984.pdf - Author's Original
More information
Published date: 1 July 2003
Organisations: Southampton Wireless Group
Identifiers
Local EPrints ID: 258340
URI: http://eprints.soton.ac.uk/id/eprint/258340
PURE UUID: ce71a676-dcfe-48c6-b526-d408072473af
Catalogue record
Date deposited: 15 Oct 2003
Last modified: 14 Mar 2024 06:07
Contributors
Author: X. Hong
Author: C.J. Harris
Author: S. Chen
Author: P.M. Sharkey