University of Southampton Institutional Repository

Benchmarking least squares support vector machine classifiers

Van Gestel, Tony, Suykens, Johan A.K., Baesens, Bart, Viaene, Stijn, Vanthienen, Jan, Dedene, Guido, De Moor, Bart and Vandewalle, Joos (2004) Benchmarking least squares support vector machine classifiers. Machine Learning, 54 (1), 5-32. (doi:10.1023/B:MACH.0000008082.80494.e0).

Record type: Article

Abstract

In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of equations in the dual space. While the SVM classifier has a large margin interpretation, the LS-SVM formulation is related in this paper to a ridge regression approach for classification with binary targets and to Fisher's linear discriminant analysis in the feature space. Multiclass categorization problems are represented by a set of binary classifiers using different output coding schemes. While regularization is used to control the effective number of parameters of the LS-SVM classifier, the sparseness property of SVMs is lost due to the choice of the 2-norm. Sparseness can be imposed in a second stage by gradually pruning the support value spectrum and optimizing the hyperparameters during the sparse approximation procedure. In this paper, twenty public domain benchmark datasets are used to evaluate the test set performance of LS-SVM classifiers with linear, polynomial and radial basis function (RBF) kernels. Both the SVM and LS-SVM classifier with RBF kernel in combination with standard cross-validation procedures for hyperparameter selection achieve comparable test set performances. These SVM and LS-SVM performances are consistently very good when compared to a variety of methods described in the literature including decision tree based algorithms, statistical algorithms and instance based learning methods. We show on ten UCI datasets that the LS-SVM sparse approximation procedure can be successfully applied.
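
The abstract describes the LS-SVM classifier only in words; as an illustration, the following Python sketch shows the standard LS-SVM dual formulation with an RBF kernel that the abstract refers to, in which training reduces to solving one linear system. This is a minimal sketch, not the authors' code: the helper names, the direct np.linalg.solve step and the fixed hyperparameters gamma and sigma are illustrative assumptions, and the cross-validation for hyperparameter selection and the pruning-based sparse approximation discussed in the abstract are omitted.

import numpy as np

def rbf_kernel(X1, X2, sigma):
    # RBF kernel matrix K_ij = exp(-||x_i - x_j||^2 / sigma^2)
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / sigma ** 2)

def lssvm_train(X, y, gamma, sigma):
    # Solve the LS-SVM dual (KKT) linear system for alpha and b.
    # Labels y must be in {-1, +1}. The least squares cost function leads to
    #     [ 0        y^T          ] [ b     ]   [ 0 ]
    #     [ y   Omega + I / gamma ] [ alpha ] = [ 1 ]
    # with Omega_ij = y_i * y_j * K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma):
    # Classify new points: sign( sum_i alpha_i * y_i * K(x, x_i) + b )
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)

# Example usage (toy data, hypothetical hyperparameter values):
# alpha, b = lssvm_train(X_train, y_train, gamma=10.0, sigma=1.0)
# y_hat = lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0)

Because the 2-norm cost makes all alpha_i nonzero, the sparse approximation mentioned in the abstract would, in a second stage, repeatedly discard the training points with the smallest |alpha_i| and retrain on the remainder.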

This record has no associated files available for download.

More information

Published date: 2004
Keywords: least squares support vector machines, multiclass support vector machines, sparse approximation

Identifiers

Local EPrints ID: 36519
URI: http://eprints.soton.ac.uk/id/eprint/36519
PURE UUID: c2c69dfe-eafd-4bf7-85ef-0a82013ce17c
ORCID for Bart Baesens: orcid.org/0000-0002-5831-5668

Catalogue record

Date deposited: 23 May 2006
Last modified: 16 Mar 2024 03:39

Contributors

Author: Tony Van Gestel
Author: Johan A.K. Suykens
Author: Bart Baesens
Author: Stijn Viaene
Author: Jan Vanthienen
Author: Guido Dedene
Author: Bart De Moor
Author: Joos Vandewalle
