University of Southampton Institutional Repository

Wrapped feature selection by means of guided neural network optimisation

Baesens, B.
Viaene, S.
Vanthienen, J.
Dedene, G.

Baesens, B., Viaene, S., Vanthienen, J. and Dedene, G. (2000) Wrapped feature selection by means of guided neural network optimisation. The Fifteenth International Conference on Pattern Recognition (ICPR'2000). 01 Sep 2000.

Record type: Conference or Workshop Item (Paper)

Abstract

In this paper, we discuss the implementation of a wrapped neural network feature selection approach, introduced here as the Weight Cascaded Retraining (WCR) algorithm. The first part of the paper outlines the algorithm and elaborates on its formal underpinnings. Central to the feature pruning approach is an iterative, guided function optimisation realised by passing the optimised weight vector from one iteration step to the next, which gives rise to a cascaded form of neural network retraining. In the second part of the paper, the theoretical exposition of the WCR algorithm is illustrated and benchmarked on publicly available UCI case material. We show that WCR-based neural network feature selection can be very effective in reducing model complexity in classification modelling with neural networks.
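Since the full text is not available from this repository, the following is only an illustrative sketch of the general idea the abstract describes — wrapper-style feature pruning with warm-started ("cascaded") retraining — and not the authors' actual WCR algorithm. All function names, the single-layer sigmoid model, the magnitude-based pruning criterion, and the synthetic data are assumptions made for illustration.

```python
# Illustrative sketch only: wrapper feature selection where each retraining
# round is warm-started from the previous round's optimised weights. This is
# NOT the WCR algorithm from the paper; it is a plausible simplification.
import numpy as np

def train_logistic(X, y, w=None, lr=0.1, epochs=200):
    """Train a single-layer sigmoid model by gradient descent on log-loss,
    optionally warm-starting from a previous weight vector w."""
    n, d = X.shape
    if w is None:
        w = np.zeros(d)
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid activations
        w -= lr * X.T @ (p - y) / n        # gradient step on log-loss
    return w

def cascaded_feature_selection(X, y, keep=2):
    """Iteratively drop the feature with the smallest absolute weight,
    passing the surviving weights to the next retraining round."""
    active = list(range(X.shape[1]))
    w = None
    while len(active) > keep:
        w = train_logistic(X[:, active], y, w)
        drop = int(np.argmin(np.abs(w)))   # weakest feature by |weight|
        del active[drop]
        w = np.delete(w, drop)             # cascade surviving weights forward
    return active

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 1] + X[:, 3] > 0).astype(float)  # only features 1 and 3 are informative
print(cascaded_feature_selection(X, y))    # the informative features should survive
```

The warm start is the point of the sketch: each pruning round resumes optimisation from the previous solution rather than retraining from scratch, mirroring the cascaded retraining idea in the abstract.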

Full text not available from this repository.

More information

Published date: 2000
Venue - Dates: The Fifteenth International Conference on Pattern Recognition (ICPR'2000), 2000-09-01 - 2000-09-01

Identifiers

Local EPrints ID: 36756
URI: http://eprints.soton.ac.uk/id/eprint/36756
PURE UUID: b3956e1c-380a-40e0-8f0a-2f0908ebf3ac
ORCID for B. Baesens: orcid.org/0000-0002-5831-5668

Catalogue record

Date deposited: 31 Jul 2006
Last modified: 05 Nov 2019 01:49


Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2

This repository has been built using EPrints software, developed at the University of Southampton, but available to everyone to use.
