A convergent iterative hard thresholding for nonnegative sparsity optimization
Pan, Lili, Zhou, Shenglong, Xiu, Naihua and Qi, Hou-Duo
(2017)
A convergent iterative hard thresholding for nonnegative sparsity optimization.
Pacific Journal of Optimization, 13 (2), 325-353.
Abstract
The iterative hard thresholding (IHT) algorithm is a popular greedy-type method for (linear and nonlinear) compressed sensing and sparse optimization problems.
In this paper, we give an improved iterative hard thresholding algorithm for solving the nonnegative sparsity optimization (NSO) problem by employing an Armijo-type stepsize rule, which automatically adjusts the stepsize and the support set and leads to a sufficient decrease of the objective function at each iteration.
Consequently, the improved IHT algorithm enjoys several convergence properties under standard assumptions. These include convergence to an $\alpha$-stationary point (also known as an $L$-stationary point in the literature when the objective function has a Lipschitz continuous gradient) and finite identification of the true support set. We also characterize when the full sequence converges to a local minimizer of NSO and establish its linear convergence rate. Extensive numerical experiments are included to demonstrate the good performance of the proposed algorithm.
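The record does not reproduce the algorithm itself, so the following is only a minimal sketch of the idea the abstract describes: projected gradient steps combined with a nonnegative hard-thresholding operator and an Armijo-type backtracking stepsize. The function names, parameter values, and the exact form of the sufficient-decrease test below are illustrative assumptions, not the paper's precise scheme.

```python
import numpy as np

def hard_threshold_nonneg(x, s):
    """Project onto {x >= 0, ||x||_0 <= s}: keep the s largest entries of max(x, 0)."""
    z = np.maximum(x, 0.0)
    keep = np.argsort(z)[::-1][:s]
    out = np.zeros_like(z)
    out[keep] = z[keep]
    return out

def iht_armijo(f, grad_f, x0, s, sigma=1e-4, beta=0.5, max_iter=500, tol=1e-8):
    """Illustrative IHT with an Armijo-type backtracking stepsize (hypothetical test)."""
    x = hard_threshold_nonneg(np.asarray(x0, dtype=float), s)
    for _ in range(max_iter):
        g = grad_f(x)
        fx = f(x)
        alpha = 1.0
        while True:
            x_new = hard_threshold_nonneg(x - alpha * g, s)
            # Hypothetical sufficient-decrease test: accept the step once the
            # objective drops by a multiple of the squared step length.
            if f(x_new) <= fx - (sigma / (2.0 * alpha)) * np.linalg.norm(x_new - x) ** 2:
                break
            alpha *= beta
            if alpha < 1e-12:  # safeguard against an endless backtracking loop
                break
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x

# Example use on a nonnegative sparse least-squares problem, f(x) = 0.5 * ||Ax - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = rng.uniform(1.0, 2.0, 5)
b = A @ x_true
x_hat = iht_armijo(lambda x: 0.5 * np.sum((A @ x - b) ** 2),
                   lambda x: A.T @ (A @ x - b),
                   x0=np.zeros(100), s=5)
```

The least-squares objective above is only an example of the compressed sensing setting the abstract mentions; the paper's analysis applies to a general objective under the stated assumptions.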
Text: CIHT - Accepted Manuscript
More information
Accepted/In Press date: 21 February 2017
Published date: May 2017
Keywords:
sparsity constrained optimization
Organisations:
Mathematical Sciences, Operational Research
Identifiers
Local EPrints ID: 408654
URI: http://eprints.soton.ac.uk/id/eprint/408654
ISSN: 1348-9151
PURE UUID: 233cd7bc-edf9-4d1e-8970-decd59cacf4f
Catalogue record
Date deposited: 25 May 2017 04:03
Last modified: 16 Mar 2024 03:41
Contributors
Author:
Lili Pan
Author:
Shenglong Zhou
Author:
Naihua Xiu
Author:
Hou-Duo Qi