University of Southampton Institutional Repository

Global and quadratic convergence of Newton hard-thresholding pursuit

Zhou, Shenglong, Xiu, Naihua and Qi, Hou-Duo (2021) Global and quadratic convergence of Newton hard-thresholding pursuit. Journal of Machine Learning Research, 22, 1-45, [12].

Record type: Article

Abstract

Algorithms based on the hard-thresholding principle have been well studied, with sound theoretical guarantees, in compressed sensing and, more generally, in sparsity-constrained optimization. It is widely observed in existing empirical studies that when a restricted Newton step is used as the debiasing step, hard-thresholding algorithms tend to meet their halting conditions in significantly fewer iterations and hence are very efficient. However, the resulting Newton hard-thresholding algorithms do not offer any better theoretical guarantees than their simple hard-thresholding counterparts. This discrepancy between theory and empirical observation has been known for some time. This paper provides a theoretical justification for the use of the restricted Newton step. We build our theory and algorithm, Newton Hard-Thresholding Pursuit (NHTP), for sparsity-constrained optimization. Our main result shows that NHTP is quadratically convergent under the standard assumptions of restricted strong convexity and smoothness. We also establish its global convergence to a stationary point under a weaker assumption. In the special case of compressive sensing, NHTP eventually reduces to some existing hard-thresholding algorithms with a Newton step. Consequently, our fast-convergence result explains why those algorithms perform better than their counterparts without the Newton step. The efficiency of NHTP is demonstrated on both synthetic and real data in compressed sensing and sparse logistic regression.
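For intuition, here is a minimal sketch (in Python/NumPy) of the kind of iteration the abstract describes: a gradient step is hard-thresholded to select a working support, and a Newton step restricted to that support serves as the debiasing step. The function name nhtp_step, the step size eta, and the dense-Hessian interface are illustrative assumptions, not the paper's exact update.

import numpy as np

def nhtp_step(x, grad, hess, s, eta=1.0):
    """One illustrative Newton hard-thresholding iteration (a sketch
    under assumed interfaces, not the paper's exact NHTP update).

    x    : current iterate, shape (n,)
    grad : callable returning the gradient of the objective at a point
    hess : callable returning the (n, n) Hessian at a point
    s    : target sparsity level
    eta  : step size for the gradient (thresholding) step
    """
    n = x.size
    # Hard-thresholding step: keep the s largest-magnitude entries of
    # the gradient step to select the working support T.
    u = x - eta * grad(x)
    T = np.argsort(np.abs(u))[-s:]
    z = np.zeros(n)
    z[T] = u[T]
    # Restricted Newton (debiasing) step: one Newton update of the
    # objective restricted to the coordinates in T.
    H_T = hess(z)[np.ix_(T, T)]
    z[T] -= np.linalg.solve(H_T, grad(z)[T])
    return z

For the compressed-sensing objective f(x) = ½‖Ax − b‖², the restricted Newton update above lands exactly at the least-squares solution on the support T, which is one way to read the abstract's remark that NHTP reduces to existing hard-thresholding algorithms with a Newton step.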

Text
Global - Accepted Manuscript (1MB)
Available under a Creative Commons Attribution license.

More information

Submitted date: 9 January 2019
Accepted/In Press date: 3 February 2021
Published date: 11 February 2021
Keywords: Global convergence, Hard thresholding, Newton's method, Quadratic convergence rate, Sparse optimization, Stationary point

Identifiers

Local EPrints ID: 433233
URI: http://eprints.soton.ac.uk/id/eprint/433233
PURE UUID: 2c56acb9-cf90-46bb-9759-b6b7e253d9f1
ORCID for Shenglong Zhou: orcid.org/0000-0003-2843-1614
ORCID for Hou-Duo Qi: orcid.org/0000-0003-3481-4814

Catalogue record

Date deposited: 12 Aug 2019 16:30
Last modified: 26 Nov 2021 02:48

Contributors

Author: Shenglong Zhou
Author: Naihua Xiu
Author: Hou-Duo Qi
