Parametric Polynomial Time Perceptron Rescaling Algorithm

Kharechko, Andriy (2006) Parametric Polynomial Time Perceptron Rescaling Algorithm. In: Broersma, Hajo, Dantchev, Stefan, Johnson, Matthew and Szeider, Stefan (eds.) Algorithms and Complexity in Durham 2006: Proceedings of the Second ACiD Workshop. College Publications, King's College, London, UK, p. 157.




Let us consider a linear feasibility problem with a possibly infinite number of inequality constraints, posed in an on-line setting: an algorithm suggests a candidate solution, and an oracle either confirms its feasibility or outputs a violated constraint vector. This model can be solved by subgradient optimisation algorithms for non-smooth functions, known in the machine learning community as perceptron algorithms, and its solvability depends on the problem dimension and the radius of the constraint set. The classical perceptron algorithm may have exponential complexity in the worst case, when the radius is infinitesimal [1]. To overcome this difficulty, the space dilation technique was exploited in the ellipsoid algorithm to make its running time polynomial [3]. A special case of space dilation, the rescaling procedure, is utilised in the perceptron rescaling algorithm [2], with a probabilistic approach to choosing the direction of dilation. A parametric version of the perceptron rescaling algorithm is the focus of this work. It is demonstrated that certain fixed parameters of the latter algorithm (the initial estimate of the radius and the relaxation parameter) may be modified and adapted to particular problems. The generalised theoretical framework makes it possible to determine convergence of the algorithm for any chosen set of values of these parameters, and suggests a potential way of decreasing the complexity of the algorithm, which remains the subject of current research.
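The on-line oracle model described above can be illustrated with a minimal sketch of the classical perceptron algorithm for a homogeneous linear feasibility problem (find x with a·x > 0 for every constraint vector a). This is a hedged illustration, not the parametric rescaling algorithm of the paper: the finite constraint list, the function names, and the iteration budget are assumptions introduced here for demonstration only.

```python
# Sketch of the classical perceptron (subgradient) algorithm in the
# on-line oracle model: the algorithm proposes a candidate solution x,
# and the oracle either confirms feasibility or returns a violated
# constraint vector. Here the "oracle" is a simple scan over a finite
# constraint list; in the general model it may be any such procedure.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def oracle(constraints, x):
    """Return a violated constraint vector (a with a.x <= 0), or None."""
    for a in constraints:
        if dot(a, x) <= 0:
            return a
    return None

def perceptron(constraints, max_iters=10000):
    """Perceptron update: add each violated constraint vector to x.

    The iteration count scales as 1/rho^2, where rho is the radius of
    the feasible region -- hence the exponential worst case when the
    radius is infinitesimal, which rescaling is designed to avoid.
    """
    n = len(constraints[0])
    x = [0.0] * n
    for _ in range(max_iters):
        a = oracle(constraints, x)
        if a is None:
            return x  # oracle confirmed feasibility
        x = [xi + ai for xi, ai in zip(x, a)]  # subgradient step
    return None  # iteration budget exhausted

# Example: constraints satisfied by any x with both coordinates positive.
if __name__ == "__main__":
    A = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
    print(perceptron(A))
```

The rescaling procedure of [2] augments this loop: when no sufficient progress is made, the space is dilated along a probabilistically chosen direction so that the effective radius grows, yielding polynomial running time.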

Item Type: Book Section
Additional Information: 18-20 September 2006 Commentary On: Texts in Algorithmics 7
ISBNs: 1904987389
Keywords: Linear programming, perceptron algorithm, subgradient descent method, online learning, oracle algorithm, space dilation.
Divisions: Faculty of Physical Sciences and Engineering > Electronics and Computer Science
ePrint ID: 263418
Date Deposited: 13 Feb 2007
Last Modified: 27 Mar 2014 20:07
