Sparse modelling using orthogonal forward regression with PRESS statistic and regularization
Chen, S., Hong, X., Harris, C.J. and Sharkey, P.M. (2004) Sparse modelling using orthogonal forward regression with PRESS statistic and regularization. IEEE Trans. Systems, Man and Cybernetics, Part B, 34 (2), 898-911.
Abstract
The paper introduces an efficient construction algorithm for obtaining sparse linear-in-the-weights regression models by directly optimizing model generalization capability. This is achieved by utilizing the delete-1 cross-validation concept and the associated leave-one-out test error, also known as the PRESS (Predicted REsidual Sums of Squares) statistic, without resorting to any other validation data set for model evaluation in the model construction process. Computational efficiency is ensured using an orthogonal forward regression, but the algorithm incrementally minimizes the PRESS statistic instead of the usual sum of squared training errors. A local regularization method can naturally be incorporated into the model selection procedure to further enforce model sparsity. The proposed algorithm is fully automatic, and the user is not required to specify any criterion to terminate the model construction procedure. Comparisons with some existing state-of-the-art modelling methods are given, and several examples are included to demonstrate the ability of the proposed algorithm to effectively construct sparse models that generalize well.
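As a rough illustration of the procedure described in the abstract, the sketch below shows one possible form of PRESS-guided orthogonal forward selection in Python using only NumPy. It is a simplification under stated assumptions, not the authors' implementation: the function name `ofr_press`, its arguments `P`, `y` and `lam`, and the use of a single fixed regularization parameter are all introduced here for illustration, whereas the paper's local regularization adapts a separate parameter for each selected term.

```python
import numpy as np

def ofr_press(P, y, lam=1e-3):
    """Greedy orthogonal forward regression guided by the leave-one-out
    (PRESS) statistic.

    Minimal sketch, not the authors' code: a single regularization
    parameter `lam` is shared by every term; the paper instead adapts
    one parameter per term (local regularization).

    P   : (N, M) matrix of candidate regressors (linear-in-the-weights terms)
    y   : (N,) vector of training targets
    Returns the indices of the selected regressors, in selection order.
    """
    N, M = P.shape
    resid = y.astype(float).copy()   # training residual e_k(n) after k terms
    eta = np.ones(N)                 # LOO weightings eta_k(n), start at 1
    W = P.astype(float).copy()       # candidate columns, orthogonalized in place
    selected = []
    best_press = np.inf

    while len(selected) < M:
        press = np.full(M, np.inf)
        for j in range(M):
            if j in selected:
                continue
            w = W[:, j]
            ww = w @ w
            if ww < 1e-12:
                continue                         # nearly dependent on chosen terms
            denom = ww + lam
            g = (w @ resid) / denom              # regularized LS weight for this term
            e_new = resid - g * w                # residual if term j were added
            eta_new = eta - (w * w) / denom      # updated LOO weightings
            press[j] = np.sum((e_new / eta_new) ** 2)  # PRESS for this choice

        j_best = int(np.argmin(press))
        # Fully automatic stopping rule: stop as soon as adding any further
        # regressor can no longer reduce the PRESS statistic.
        if press[j_best] >= best_press:
            break
        best_press = press[j_best]
        selected.append(j_best)

        # Commit the chosen term: update residual and LOO weightings, then
        # orthogonalize the remaining candidates against it (Gram-Schmidt).
        w = W[:, j_best]
        denom = w @ w + lam
        g = (w @ resid) / denom
        resid -= g * w
        eta -= (w * w) / denom
        for j in range(M):
            if j not in selected:
                W[:, j] -= ((w @ W[:, j]) / (w @ w)) * w

    return selected
```

In this simplified form the stopping rule is the one highlighted in the abstract: selection terminates automatically as soon as the leave-one-out PRESS statistic stops decreasing, so no separate validation set or user-supplied termination criterion is needed.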
Files
ofrPRESS.ps (Other)
01275524.pdf (Text)
More information
Published date: April 2004
Organisations:
Southampton Wireless Group
Identifiers
Local EPrints ID: 259231
URI: http://eprints.soton.ac.uk/id/eprint/259231
PURE UUID: 579cb24f-c1ed-4745-a35b-3931ceb24b27
Catalogue record
Date deposited: 31 Mar 2004
Last modified: 14 Mar 2024 06:22
Contributors
Author: S. Chen
Author: X. Hong
Author: C.J. Harris
Author: P.M. Sharkey