University of Southampton Institutional Repository

On the Generalisation of Soft Margin Algorithms


Shawe-Taylor, J. and Cristianini, N. (2002) On the Generalisation of Soft Margin Algorithms. IEEE Transactions on Information Theory, 48 (10), 2721-2735.

Record type: Article

Abstract

Generalization bounds depending on the margin of a classifier are a relatively recent development. They provide an explanation of the performance of state-of-the-art learning systems such as support vector machines (SVMs) [1] and AdaBoost [2]. The difficulty with these bounds has been either their lack of robustness or their looseness. The question of whether the generalization of a classifier can be more tightly bounded in terms of a robust measure of the distribution of margin values has remained open for some time. The paper answers this open question in the affirmative and, furthermore, the analysis leads to bounds that motivate the previously heuristic soft margin SVM algorithms as well as justifying the use of the quadratic loss in neural network training algorithms. The results are extended to give bounds for the probability of failing to achieve a target accuracy in regression prediction, with a statistical analysis of ridge regression and Gaussian processes as a special case. The analysis presented in the paper has also led to new boosting algorithms described elsewhere.
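The soft margin objective that the abstract says these bounds motivate can be illustrated with a minimal sketch. This is not the paper's analysis or algorithm, only a toy subgradient-descent implementation of the standard hinge-loss objective (1/2)||w||² + C·Σᵢ max(0, 1 − yᵢ(w·xᵢ + b)), in which the parameter C trades margin width against the slack incurred by points inside the margin; all names and the synthetic data are illustrative.

```python
import numpy as np

def soft_margin_svm(X, y, C=1.0, lr=0.01, epochs=200):
    """Train a linear soft margin SVM by stochastic subgradient descent on
    the hinge-loss objective (1/2)||w||^2 + C * sum_i max(0, 1 - y_i(w.x_i + b))."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in np.random.permutation(n):
            if y[i] * (X[i] @ w + b) < 1:      # point inside the margin: slack > 0
                w -= lr * (w - C * y[i] * X[i])
                b += lr * C * y[i]
            else:                              # margin satisfied: only the regulariser acts
                w -= lr * w
    return w, b

# Toy two-class data with labels in {-1, +1}
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
w, b = soft_margin_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
```

Smaller C permits more slack (a "softer" margin); the paper's contribution is bounding the generalization error of such classifiers via a robust measure of the margin distribution rather than the minimum margin alone.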

File: main.ps (402 kB)

More information

Published date: October 2002
Organisations: Electronics & Computer Science

Identifiers

Local EPrints ID: 259008
URI: http://eprints.soton.ac.uk/id/eprint/259008
ISSN: 0018-9448
PURE UUID: 0077873c-dc61-46bd-b373-587885642273

Catalogue record

Date deposited: 05 Mar 2004
Last modified: 14 Mar 2024 06:17

Contributors

Author: J. Shawe-Taylor
Author: N. Cristianini


Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2

This repository has been built using EPrints software, developed at the University of Southampton, but available to everyone to use.
