MGProx: a nonsmooth multigrid proximal gradient method with adaptive restriction for strongly convex optimization
Ang, Andersen, De Sterck, Hans and Vavasis, Stephen
(2024)
MGProx: a nonsmooth multigrid proximal gradient method with adaptive restriction for strongly convex optimization.
SIAM Journal on Optimization, 34 (3), 2788-2820.
(doi:10.1137/23M1552140).
Abstract
We study the combination of proximal gradient descent with multigrid for solving a class of possibly nonsmooth strongly convex optimization problems. We propose a multigrid proximal gradient method called MGProx, which accelerates the proximal gradient method via multigrid, exploiting hierarchical information of the optimization problem. MGProx applies a newly introduced adaptive restriction operator to simplify the Minkowski sum of subdifferentials of the nondifferentiable objective function across different levels. We provide a theoretical characterization of MGProx. First, we show that the MGProx update operator exhibits a fixed-point property. Next, we show that the coarse correction is a descent direction for the fine variable of the original fine-level problem in the general nonsmooth case. Finally, under some assumptions we provide a convergence rate for the algorithm. In numerical tests on the elastic obstacle problem, an example of a nonsmooth convex optimization problem to which the multigrid method can be applied, we show that MGProx converges faster than competing methods.
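To make the high-level description above concrete, the following Python sketch illustrates one way to combine proximal gradient smoothing with a multigrid-style coarse correction on a toy 1D obstacle-type quadratic program (minimize 0.5 x'Ax - b'x subject to x >= 0). This is only an illustrative sketch under simplifying assumptions, not the authors' MGProx algorithm: it uses a fixed full-weighting restriction rather than the paper's adaptive restriction operator, it solves the coarse problem without the nonsmooth term, and the names prox_nonneg, prox_grad_step and two_level_cycle are hypothetical.

    # Hypothetical illustration only: proximal gradient smoothing plus a
    # multigrid-style coarse correction on a toy 1D obstacle-type QP.
    # NOT the authors' MGProx algorithm (fixed, non-adaptive restriction).
    import numpy as np

    def prox_nonneg(x):
        # proximal operator of the indicator of the nonnegative orthant
        return np.maximum(x, 0.0)

    def prox_grad_step(x, A, b, step):
        # one proximal gradient step on f(x) = 0.5 x'Ax - b'x with g = indicator(x >= 0)
        return prox_nonneg(x - step * (A @ x - b))

    def restrict(n_coarse, n_fine):
        # fixed full-weighting restriction matrix (1/4, 1/2, 1/4 stencil)
        R = np.zeros((n_coarse, n_fine))
        for i in range(n_coarse):
            j = 2 * i + 1
            R[i, j - 1:j + 2] = [0.25, 0.5, 0.25]
        return R

    def two_level_cycle(x, A, b, step, n_smooth=3):
        n = len(x)
        nc = (n - 1) // 2
        R = restrict(nc, n)
        P = 2.0 * R.T                      # prolongation as scaled transpose
        for _ in range(n_smooth):          # pre-smoothing with proximal gradient
            x = prox_grad_step(x, A, b, step)
        Ac = R @ A @ P                     # Galerkin coarse operator
        rc = R @ (b - A @ x)               # restricted residual
        ec = np.linalg.solve(Ac, rc)       # unconstrained coarse correction (illustration only)
        x = prox_nonneg(x + P @ ec)        # corrected iterate, projected back to feasibility
        for _ in range(n_smooth):          # post-smoothing
            x = prox_grad_step(x, A, b, step)
        return x

    # toy 1D Laplacian model problem
    n = 31
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.linspace(-1.0, 1.0, n)
    step = 1.0 / np.linalg.norm(A, 2)      # step size <= 1/L for the smooth part
    x = np.zeros(n)
    for _ in range(20):
        x = two_level_cycle(x, A, b, step)

In the paper, by contrast, the restriction operator is adapted to the current iterate so as to simplify the Minkowski sum of subdifferentials across levels; the fixed stencil above is used purely to keep the sketch short.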
Text: 2302.04077v3 - Accepted Manuscript
Text: 23m1552140 - Version of Record (restricted to Repository staff only)
More information
Accepted/In Press date: 9 May 2024
e-pub ahead of print date: 13 August 2024
Published date: 13 August 2024
Keywords:
multigrid, restriction, proximal gradient, subdifferential, convex optimization, obstacle problem
Identifiers
Local EPrints ID: 491047
URI: http://eprints.soton.ac.uk/id/eprint/491047
ISSN: 1052-6234
PURE UUID: f75f1fcf-519b-49f8-ae59-88785004d76e
Catalogue record
Date deposited: 11 Jun 2024 16:45
Last modified: 24 Oct 2024 02:06
Contributors
Author: Andersen Ang
Author: Hans De Sterck
Author: Stephen Vavasis