Automatic kernel regression modelling using combined leave-one-out test score and regularised orthogonal least squares
Hong, X., Chen, S. and Sharkey, P.M. (2004) Automatic kernel regression modelling using combined leave-one-out test score and regularised orthogonal least squares. International Journal of Neural Systems, 14 (1), 27-37.
Abstract
This paper introduces an automatic robust nonlinear identification algorithm using the leave-one-out test score, also known as the PRESS (Predicted REsidual Sums of Squares) statistic, and regularised orthogonal least squares. The proposed algorithm aims to maximise model robustness via two effective and complementary approaches: parameter regularisation via ridge regression and selection of an optimal model structure for generalisation. The major contributions are to derive the PRESS error in a regularised orthogonal weight model, to develop an efficient recursive formula for computing PRESS errors within the regularised orthogonal least squares forward regression framework, and hence to construct a model with good generalisation properties. Based on the properties of the PRESS statistic, the proposed algorithm achieves a fully automated model construction procedure without resorting to a separate validation data set for model evaluation.
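The central quantity in the abstract, the PRESS (leave-one-out) error for a regularised linear-in-the-parameters model, can be illustrated with a short sketch. This is a minimal illustration, assuming a plain ridge-regularised model and the standard hat-matrix identity for leave-one-out residuals; it does not reproduce the paper's recursive PRESS computation within the regularised orthogonal least squares forward regression framework. The function name press_errors and the parameter lam are illustrative, not taken from the paper.

```python
import numpy as np

def press_errors(Phi, y, lam=1e-3):
    """Leave-one-out (PRESS) residuals for a ridge-regularised linear model.

    Phi : (N, M) design matrix of regressors (e.g. kernel regressors)
    y   : (N,) target vector
    lam : ridge regularisation parameter (illustrative default)
    """
    N, M = Phi.shape
    # Ridge solution: theta = (Phi^T Phi + lam I)^{-1} Phi^T y
    A = Phi.T @ Phi + lam * np.eye(M)
    theta = np.linalg.solve(A, Phi.T @ y)
    # Diagonal of the hat matrix: h_ii = phi_i^T (Phi^T Phi + lam I)^{-1} phi_i
    h = np.einsum('ij,ij->i', Phi @ np.linalg.inv(A), Phi)
    residuals = y - Phi @ theta
    # Standard leave-one-out identity: e_{-i} = e_i / (1 - h_ii)
    return residuals / (1.0 - h)

# The PRESS statistic is the sum of squared leave-one-out residuals:
# press = np.sum(press_errors(Phi, y) ** 2)
```

In the paper, this quantity is instead obtained recursively as regressors are added one at a time in the orthogonal forward regression procedure, which avoids recomputing the full hat matrix at each step.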
More information
Published date: February 2004
Additional Information:
accepted for publication in September 2003
Organisations:
Southampton Wireless Group
Identifiers
Local EPrints ID: 259058
URI: http://eprints.soton.ac.uk/id/eprint/259058
PURE UUID: de144dd1-dc84-42ac-a13c-f8c1bebd54bb
Catalogue record
Date deposited: 11 Mar 2004
Last modified: 14 Mar 2024 06:18
Contributors
Author:
X. Hong
Author:
S. Chen
Author:
P.M. Sharkey