A forward-constrained regression algorithm for sparse kernel density estimation
Hong, Xia, Chen, Sheng and Harris, Chris J. (2008) A forward-constrained regression algorithm for sparse kernel density estimation. IEEE Transactions on Neural Networks, 19 (1), 193-198. (doi:10.1109/TNN.2007.908645)
Abstract
Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward-constrained regression (FCR) manner. The proposed algorithm selects significant kernels one at a time, while the leave-one-out (LOO) test score is minimized subject to a simple positivity constraint in each forward stage. The model parameter estimate in each forward stage is simply the solution of a jackknife parameter estimator for a single parameter, subject to the same positivity constraint check. For each selected kernel, the associated kernel width is updated via the Gauss–Newton method with the model parameter estimate fixed. The proposed approach is simple to implement and its computational cost is very low. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
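To illustrate the general idea described in the abstract, the following is a minimal sketch, not the authors' implementation: it greedily selects Gaussian kernels one at a time so that the sparse model approaches a Parzen window target, fitting a single mixing parameter per stage and clipping it to [0, 1] as a simple positivity constraint. It assumes fixed kernel widths chosen by Silverman's rule, uses a plain least-squares fit in place of the paper's jackknife/LOO criterion, and omits the Gauss–Newton width refinement; all variable names and the toy data are illustrative.

# Minimal sketch (assumptions noted above): greedy forward selection of Gaussian
# kernels toward a Parzen window target, one nonnegative mixing parameter per stage.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data from a two-component Gaussian mixture.
X = np.concatenate([rng.normal(-2.0, 0.6, 150), rng.normal(1.5, 1.0, 150)])
N = len(X)

def gauss(x, c, h):
    """Gaussian kernel with centre c and width h, normalised to integrate to one."""
    return np.exp(-0.5 * ((x - c) / h) ** 2) / (h * np.sqrt(2.0 * np.pi))

# Parzen window (PW) estimate used as the regression target at the data points.
h_pw = 1.06 * X.std() * N ** (-0.2)          # Silverman's rule of thumb
K_all = gauss(X[:, None], X[None, :], h_pw)  # K_all[i, j] = kernel centred at X[j], evaluated at X[i]
target = K_all.mean(axis=1)                  # PW density evaluated at the data

# Forward selection of a sparse kernel subset.
max_kernels, tol = 10, 1e-6
selected, f = [], np.zeros(N)                # current sparse model evaluated at the data
for stage in range(max_kernels):
    best = None
    for j in range(N):
        if j in selected:
            continue
        k_j = K_all[:, j]
        # Single mixing parameter lam for f_new = (1 - lam) * f + lam * k_j,
        # fitted by least squares and clipped to [0, 1] (positivity constraint).
        d = k_j - f
        lam = np.clip(d @ (target - f) / (d @ d + 1e-12), 0.0, 1.0)
        err = np.mean((target - ((1.0 - lam) * f + lam * k_j)) ** 2)
        if best is None or err < best[0]:
            best = (err, j, lam)
    err, j, lam = best
    prev_err = np.mean((target - f) ** 2)
    if prev_err - err < tol:                 # stop when the improvement is negligible
        break
    selected.append(j)
    f = (1.0 - lam) * f + lam * K_all[:, j]

print(f"selected {len(selected)} of {N} candidate kernels")

The convex-combination update is what keeps the stage-wise estimation down to a single parameter; the paper additionally tunes each selected kernel's width, which typically yields a sparser model than this fixed-width sketch.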
Text: TNN2008-19-1 - Version of Record (Restricted to Repository staff only)
Text: 04378278.pdf - Other (Restricted to Repository staff only)
More information
Published date: 1 January 2008
Organisations: Southampton Wireless Group
Identifiers
Local EPrints ID: 265038
URI: http://eprints.soton.ac.uk/id/eprint/265038
PURE UUID: fed9e170-b5b9-4235-87f1-936c4d47acde
Catalogue record
Date deposited: 15 Jan 2008 12:12
Last modified: 14 Mar 2024 08:01
Contributors
Author: Xia Hong
Author: Sheng Chen
Author: Chris J. Harris