University of Southampton Institutional Repository

Parametric Polynomial Time Perceptron Rescaling Algorithm

Kharechko, Andriy (2006) Parametric Polynomial Time Perceptron Rescaling Algorithm. In: Broersma, Hajo, Dantchev, Stefan, Johnson, Matthew and Szeider, Stefan (eds.) Algorithms and Complexity in Durham 2006: Proceedings of the Second ACiD Workshop. College Publications, p. 157.

Record type: Conference or Workshop Item (Paper)

Abstract

Let us consider a linear feasibility problem with a possibly infinite number of inequality constraints, posed in an on-line setting: an algorithm suggests a candidate solution, and an oracle either confirms its feasibility or outputs a violated constraint vector. This model can be solved by subgradient optimisation algorithms for non-smooth functions, known in the machine learning community as perceptron algorithms, and its solvability depends on the problem dimension and the radius of the constraint set. The classical perceptron algorithm may have exponential complexity in the worst case when the radius is infinitesimal [1]. To overcome this difficulty, the space dilation technique was exploited in the ellipsoid algorithm to make its running time polynomial [3]. A special case of space dilation, the rescaling procedure, is utilised in the perceptron rescaling algorithm [2], with a probabilistic approach to choosing the direction of dilation. A parametric version of the perceptron rescaling algorithm is the focus of this work. It is demonstrated that some fixed parameters of the latter algorithm (the initial estimate of the radius and the relaxation parameter) may be modified and adapted to particular problems. The generalised theoretical framework makes it possible to establish convergence of the algorithm for any chosen set of values of these parameters, and it suggests a potential way of decreasing the complexity of the algorithm, which remains the subject of current research.
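As a rough illustration of the model described above, the Python sketch below implements the classical perceptron update for a homogeneous feasibility problem (find x with a·x > 0 for every constraint a) together with a rescaling loop in the spirit of [2]. Everything here is an assumption made for illustration: the finite constraint matrix stands in for the oracle (which may draw from an infinite constraint set), the phase length and stretch factor are placeholders rather than the constants of [2], and [2] chooses the direction of dilation probabilistically rather than deterministically as done here.

```python
import numpy as np

def violated(A, x):
    """Return a violated constraint row (a . x <= 0), or None if x is feasible.

    A finite, explicit constraint matrix stands in for the paper's oracle,
    which may draw from an infinite constraint set.
    """
    for a in A:
        if a @ x <= 0:
            return a
    return None

def perceptron_phase(A, max_iters):
    """Classical perceptron: add each violated constraint to the iterate.

    This is the subgradient step the abstract refers to; when the radius of
    the constraint set is infinitesimal it may need exponentially many
    iterations [1].
    """
    x = np.zeros(A.shape[1])
    for _ in range(max_iters):
        a = violated(A, x)
        if a is None:
            break                    # feasible point found
        x = x + a                    # subgradient step towards feasibility
    return x

def perceptron_rescaling(A, phases=50, iters_per_phase=1000):
    """Hedged sketch of a rescaling loop in the spirit of [2].

    When a phase fails, space is dilated along the last iterate; `phases`,
    `iters_per_phase` and the stretch factor below are placeholder values,
    not the constants of [2] or of the parametric version in this paper.
    """
    A = A / np.linalg.norm(A, axis=1, keepdims=True)   # unit constraint rows
    T = np.eye(A.shape[1])                             # accumulated dilation
    for _ in range(phases):
        x = perceptron_phase(A, iters_per_phase)
        if violated(A, x) is None:
            return T @ x             # map the solution back to original space
        nrm = np.linalg.norm(x)
        if nrm == 0.0:
            continue                 # degenerate iterate; retry the phase
        z = x / nrm                                    # direction of dilation
        S = np.eye(len(z)) + np.outer(z, z)            # stretch space along z
        A = A @ S                                      # rescale the constraints
        A = A / np.linalg.norm(A, axis=1, keepdims=True)
        T = T @ S                                      # compose the dilations
    return None                      # no feasible point within the budget
```

In this sketch the phase length and the stretch factor loosely play the role of the fixed parameters that the abstract proposes to tune; the paper's contribution is the convergence analysis for arbitrary choices of such parameters, not any particular implementation.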

Text: acidabstract.pdf (28kB)
Other: acidabstract.ps (62kB)

More information

Published date: 2006
Additional Information: 18-20 September 2006. Commentary on: Texts in Algorithmics 7.
Keywords: Linear programming, perceptron algorithm, subgradient descent method, online learning, oracle algorithm, space dilation.
Organisations: Electronics & Computer Science

Identifiers

Local EPrints ID: 263418
URI: http://eprints.soton.ac.uk/id/eprint/263418
ISBN: 1-904987-38-9
PURE UUID: 8ba2ceaf-c48c-4503-a4c8-b271ffaa6f91

Catalogue record

Date deposited: 13 Feb 2007
Last modified: 14 Mar 2024 07:31

Contributors

Author: Andriy Kharechko
Editor: Hajo Broersma
Editor: Stefan Dantchev
Editor: Matthew Johnson
Editor: Stefan Szeider
