University of Southampton Institutional Repository

Subspace Newton method for the l0-regularized optimization

Zhou, Shenglong, Pan, Lili and Xiu, Naihua (2020) Subspace Newton method for the l0-regularized optimization. arXiv, (2004.05132). (In Press)

Record type: Article

Abstract

Sparse optimization has advanced rapidly over the past decade, with extensive applications ranging from image and signal processing to statistics and machine learning. Regularization is a popular and tractable approach, yielding a regularized optimization problem in which the l0 norm, or one of its continuous approximations characterizing sparsity, is penalized in the objective. Owing to the discreteness of the l0 norm, in contrast to the continuity of its approximations, l0-regularized optimization is the most challenging of these models. Numerous numerically effective methods have been proposed to tackle its hardness. However, most of them guarantee only that a (sub)sequence converges to a stationary point from the deterministic optimization perspective, or that the distance between each iterate and any given sparse reference point is bounded by an error bound in a probabilistic sense. We design SNL0, a subspace Newton method for l0-regularized optimization, and prove that the sequence it generates converges globally to a stationary point under a strong smoothness condition. In addition, the method is quadratically convergent under local strong convexity, which explains why, as a second-order method, it converges very fast. Moreover, we devise a novel mechanism for updating the penalty parameter effectively, which removes the tedious parameter tuning from which most regularized optimization methods suffer.

Text: Subspace-Newton-L0 - Accepted Manuscript (1MB)

More information

Accepted/In Press date: 10 April 2020

Identifiers

Local EPrints ID: 439547
URI: http://eprints.soton.ac.uk/id/eprint/439547
PURE UUID: d5ab9dc2-9207-439f-a465-4a2f0dec3593
ORCID for Shenglong Zhou: orcid.org/0000-0003-2843-1614

Catalogue record

Date deposited: 27 Apr 2020 16:30
Last modified: 16 Mar 2024 07:35

Contributors

Author: Shenglong Zhou (ORCID: orcid.org/0000-0003-2843-1614)
Author: Lili Pan
Author: Naihua Xiu


