Sparse model construction using coordinate descent optimization
July 2013
Hong, Xia, Guo, Yi, Chen, Sheng and Gao, Junbin (2013) Sparse model construction using coordinate descent optimization. 18th International Conference on Signal Processing, Santorini, Greece, 01 - 03 Jul 2013. 6 pp.
Record type: Conference or Workshop Item (Paper)
Abstract
We propose a new sparse model construction method aimed at maximizing a model’s generalisation capability for a large class of linear-in-the-parameters models. The coordinate descent optimization algorithm is employed with a modified l1-penalized least squares cost function in order to estimate a single parameter and its regularization parameter simultaneously, based on the leave-one-out mean square error (LOOMSE). Our original contribution is to derive a closed form of the optimal LOOMSE regularization parameter for a single-term model, for which we show that the LOOMSE can be analytically computed without actually splitting the data set, leading to a very simple parameter estimation method. We then integrate these new results within the coordinate descent optimization algorithm to update the model parameters one at a time for linear-in-the-parameters models. Consequently, a fully automated procedure is achieved without resorting to any other validation data set for iterative model evaluation. Illustrative examples are included to demonstrate the effectiveness of the new approach.
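The following Python sketch (an illustration under simplifying assumptions, not the authors' derivation) shows the overall scheme described in the abstract: cyclic coordinate descent over a linear-in-the-parameters model, where each single-term update selects its own regularization parameter by minimizing an analytically computed LOOMSE, so no validation split is needed. The paper's modified l1-penalized cost and its closed-form optimal LOOMSE regularization parameter are replaced here by an l2-style shrinkage and a small grid search, for which the standard leave-one-out identity e_i/(1 - h_i) holds; all function names and parameters below are hypothetical.

import numpy as np


def single_term_update(phi, r, lam_grid):
    # Fit a single column phi to the current residual r.  For each candidate
    # regularization value, the LOOMSE is computed analytically via the
    # linear-smoother identity e_loo = e / (1 - h), so no data splitting is
    # needed; the best (theta, loomse) pair is returned.
    ptp = phi @ phi
    ptr = phi @ r
    best_theta = 0.0
    best_loomse = np.mean(r ** 2)          # baseline: keep this term at zero
    for lam in lam_grid:
        theta = ptr / (ptp + lam)          # regularized one-parameter LS fit
        e = r - phi * theta                # in-sample residuals
        h = phi ** 2 / (ptp + lam)         # leverages of the 1-D smoother
        loomse = np.mean((e / (1.0 - h)) ** 2)
        if loomse < best_loomse:
            best_theta, best_loomse = theta, loomse
    return best_theta, best_loomse


def coordinate_descent_loomse(Phi, y, n_sweeps=10, lam_grid=None):
    # Cyclic coordinate descent over a linear-in-the-parameters model
    # y ~ Phi @ theta: each parameter is updated in turn with its own
    # LOOMSE-selected regularization; terms that never reduce the LOOMSE
    # stay at zero, which is what yields a sparse model.
    m = Phi.shape[1]
    if lam_grid is None:
        lam_grid = np.logspace(-6, 2, 30)
    theta = np.zeros(m)
    for _ in range(n_sweeps):
        for j in range(m):
            r_j = y - Phi @ theta + Phi[:, j] * theta[j]   # residual excluding term j
            theta[j], _ = single_term_update(Phi[:, j], r_j, lam_grid)
    return theta


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((200, 20))
    true_theta = np.zeros(20)
    true_theta[[2, 7, 11]] = [1.5, -2.0, 0.8]
    y = Phi @ true_theta + 0.1 * rng.standard_normal(200)
    theta_hat = coordinate_descent_loomse(Phi, y)
    print("selected terms:", np.flatnonzero(np.abs(theta_hat) > 1e-3))

In the paper the per-term regularization parameter is obtained in closed form rather than by grid search, which is what makes the procedure fully automatic.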
Text: DSP2013-sm-cdo.pdf - Version of Record
More information
Published date: July 2013
Venue - Dates: 18th International Conference on Signal Processing, Santorini, Greece, 2013-07-01 - 2013-07-03
Organisations: Southampton Wireless Group
Identifiers
Local EPrints ID: 354096
URI: http://eprints.soton.ac.uk/id/eprint/354096
PURE UUID: 5d47b1d3-039a-4e41-b33f-2687a44ffecf
Catalogue record
Date deposited: 01 Jul 2013 10:58
Last modified: 14 Mar 2024 14:13
Contributors
Author: Xia Hong
Author: Yi Guo
Author: Sheng Chen
Author: Junbin Gao