University of Southampton Institutional Repository

Implicit Feature Selection with the Value Difference Metric

Payne, Terry R.
0bb13d45-2735-45a3-b72c-472fddbd0bb4
Edwards, Peter
5ee73a94-75a0-426f-ab1b-ce918b06a1ea

Payne, Terry R. and Edwards, Peter (1998) Implicit Feature Selection with the Value Difference Metric. European Conference on Artificial Intelligence, Brighton, United Kingdom. pp. 450-454.

Record type: Conference or Workshop Item (Paper)

Abstract

The nearest neighbour paradigm provides an effective approach to supervised learning. However, it is especially susceptible to the presence of irrelevant attributes. Whilst many approaches have been proposed that select only the most relevant attributes within a data set, these approaches involve pre-processing the data in some way and can often be computationally complex. The Value Difference Metric (VDM) is a symbolic distance metric used by a number of different nearest neighbour learning algorithms. This paper demonstrates how the VDM can be used to reduce the impact of irrelevant attributes on classification accuracy without the need to pre-process the data. We illustrate how this metric uses simple probabilistic techniques to weight features in the instance space, and then apply this weighting technique to an alternative symbolic distance metric. The resulting distance metrics are compared in terms of classification accuracy on a number of real-world and artificial data sets.
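The weighting effect the abstract describes can be sketched in a few lines of Python. This sketch is illustrative only and is not taken from the paper: the function names, the handling of unseen values, and the exponent q = 1 are assumptions. The idea is that each symbolic value is characterised by its class-conditional distribution P(class | value); attributes whose values all induce near-identical distributions (i.e. irrelevant attributes) contribute almost nothing to the distance, so no explicit feature selection step is needed.

from collections import Counter, defaultdict

def fit_vdm(X, y):
    # Estimate P(class | attribute = value) for every symbolic attribute.
    # X: list of instances (tuples of symbolic values); y: list of class labels.
    n_attrs = len(X[0])
    classes = sorted(set(y))
    cond = []
    for a in range(n_attrs):
        value_class_counts = defaultdict(Counter)
        for xi, yi in zip(X, y):
            value_class_counts[xi[a]][yi] += 1
        probs = {}
        for v, counts in value_class_counts.items():
            total = sum(counts.values())
            probs[v] = {c: counts[c] / total for c in classes}
        cond.append(probs)
    return cond, classes

def vdm_distance(cond, classes, x1, x2, q=1):
    # Per-attribute distance: sum over classes of |P(c|v1) - P(c|v2)|**q.
    # Values of an irrelevant attribute have near-identical class-conditional
    # distributions, so that attribute adds little to the total distance;
    # this is the implicit feature weighting discussed in the abstract.
    d = 0.0
    for a, (v1, v2) in enumerate(zip(x1, x2)):
        p1 = cond[a].get(v1, {})
        p2 = cond[a].get(v2, {})
        d += sum(abs(p1.get(c, 0.0) - p2.get(c, 0.0)) ** q for c in classes)
    return d

For example, fit_vdm([("red", "a"), ("blue", "a"), ("red", "b")], ["+", "+", "-"]) followed by vdm_distance(cond, classes, ("red", "a"), ("blue", "b")) returns a distance dominated by whichever attribute's values best separate the classes.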

Text
ecai98.pdf - Other
Download (139kB)

More information

Published date: 1998
Venue - Dates: European Conference on Artificial Intelligence, Brighton, United Kingdom, 1998-01-01
Organisations: Electronics & Computer Science

Identifiers

Local EPrints ID: 262997
URI: http://eprints.soton.ac.uk/id/eprint/262997
PURE UUID: 41123119-5afa-46b5-b85e-787c2ee12d19

Catalogue record

Date deposited: 20 Sep 2006
Last modified: 14 Mar 2024 07:23


Contributors

Author: Terry R. Payne
Author: Peter Edwards

Download statistics

Downloads from ePrints over the past year. Other digital versions may also be available to download, e.g. from the publisher's website.



