University of Southampton Institutional Repository

A stochastic gradient method with biased estimation for faster nonconvex optimization

Bi, Jia
e07a78d1-62dd-4b1d-b223-4107aa3627c7
Gunn, Steve R.
306af9b3-a7fa-4381-baf9-5d6a6ec89868
Nayak, A.
Sharma, A.

Bi, Jia and Gunn, Steve R. (2019) A stochastic gradient method with biased estimation for faster nonconvex optimization. In Nayak, A. and Sharma, A. (eds.) PRICAI 2019: Trends in Artificial Intelligence. vol. 11671, Springer, Cham, pp. 337-349. (doi:10.1007/978-3-030-29911-8_26).

Record type: Conference or Workshop Item (Paper)

Abstract

A number of optimization approaches have been proposed for optimizing nonconvex objectives (e.g. deep learning models), such as batch gradient descent, stochastic gradient descent and stochastic variance reduced gradient descent. Theory shows that these methods converge when they use an unbiased gradient estimator. In practice, however, a biased gradient estimator can reach the vicinity of a solution more efficiently, since an unbiased approach is computationally more expensive. Fast convergence therefore involves two trade-offs: between stochastic and batch computation, and between biased and unbiased estimation. This paper proposes an integrated approach that controls the stochastic element of the optimizer and balances the estimator between biased and unbiased through a hyper-parameter. It is shown theoretically and experimentally that this hyper-parameter can be configured to provide an effective balance and improve the convergence rate.
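The abstract does not give the form of the paper's estimator, so the following is only a hypothetical sketch of the biased/unbiased trade-off it describes, not the authors' method. It mixes a plain stochastic gradient (unbiased) with a SARAH-style recursive estimate (biased, but lower variance near convergence) via an assumed hyper-parameter `theta`, on a toy least-squares problem; all names, the estimator form, and the step sizes are assumptions.

```python
import numpy as np

# Toy least-squares objective F(w) = (1/2n) * ||Xw - y||^2.
rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

def loss(w):
    r = X @ w - y
    return 0.5 * np.mean(r ** 2)

def grad_i(w, i):
    # Gradient of the i-th component f_i(w) = 0.5 * (x_i . w - y_i)^2.
    return (X[i] @ w - y[i]) * X[i]

def mixed_sgd(theta, lr=0.05, epochs=5):
    # Hypothetical estimator: theta = 1 gives plain SGD (unbiased);
    # theta = 0 gives a SARAH-style recursive estimator (biased).
    # This interpolation is an illustration, not the paper's estimator.
    w = np.zeros(d)
    v = X.T @ (X @ w - y) / n        # initialize with the full gradient
    w_prev = w.copy()
    for _ in range(epochs * n):
        i = rng.integers(n)
        v = theta * grad_i(w, i) + (1 - theta) * (grad_i(w, i) - grad_i(w_prev, i) + v)
        w_prev = w.copy()
        w -= lr * v
    return w

for theta in (1.0, 0.5, 0.0):
    print(f"theta={theta}: final loss {loss(mixed_sgd(theta)):.2e}")
```

Under this sketch, intermediate values of `theta` trade the unbiasedness of the SGD term against the reduced variance of the recursive term, mirroring the balance the abstract attributes to the proposed hyper-parameter.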

Text: IJCAI19 - Accepted Manuscript (436kB)

More information

Submitted date: 8 December 2018
e-pub ahead of print date: 23 August 2019
Published date: 2019
Venue - Dates: 2019 International Joint Conference on Artificial Intelligence, Macao, China, 2019-08-10 - 2019-08-16
Keywords: Deep learning, Optimisation

Identifiers

Local EPrints ID: 430899
URI: http://eprints.soton.ac.uk/id/eprint/430899
PURE UUID: 9e79ca99-ec59-4ff3-aced-2458b9f8d701

Catalogue record

Date deposited: 17 May 2019 16:30
Last modified: 23 Aug 2020 04:01


