MGProx: a nonsmooth multigrid proximal gradient method with adaptive restriction for strongly convex optimization
Ang, Andersen; De Sterck, Hans; Vavasis, Stephen
10 May 2024
Abstract
We study the combination of proximal gradient descent with multigrid methods for solving a class of possibly nonsmooth, strongly convex optimization problems. We propose a multigrid proximal gradient method, called MGProx, which accelerates the proximal gradient method by exploiting hierarchical information about the optimization problem. MGProx applies a newly introduced adaptive restriction operator to simplify the Minkowski sum of subdifferentials of the nondifferentiable objective function across different levels. We provide a theoretical characterization of MGProx. First, we show that the MGProx update operator exhibits a fixed-point property. Next, we show that the coarse correction is a descent direction for the fine variable of the original fine-level problem in the general nonsmooth case. Lastly, under some assumptions we establish the convergence rate of the algorithm. In numerical tests on the elastic obstacle problem, an example of a nonsmooth convex optimization problem to which multigrid methods can be applied, we show that MGProx converges faster than competing methods.
Text: 2302.04077v3 (Author's Original)
More information
Published date: 10 May 2024
Identifiers
Local EPrints ID: 491047
URI: http://eprints.soton.ac.uk/id/eprint/491047
PURE UUID: f75f1fcf-519b-49f8-ae59-88785004d76e
Catalogue record
Date deposited: 11 Jun 2024 16:45
Last modified: 12 Jun 2024 02:08
Contributors
Author:
Andersen Ang
Author:
Hans De Sterck
Author:
Stephen Vavasis