Global relaxation-based LP-Newton method for multiple hyperparameter selection in support vector classification with feature selection
Qian, Yaru
Li, Qingna
Zemkoho, Alain
Abstract
Support vector classification (SVC) is an effective tool for classification tasks in machine learning, but its performance depends on the selection of appropriate hyperparameters. This paper focuses on jointly optimizing the regularization hyperparameter C and the feature bounds used for feature selection within SVC, which leads to a potentially large hyperparameter space and hence to the well-known curse of dimensionality. To address this challenge of multiple hyperparameter selection, the problem is formulated as a bilevel optimization problem, which is then transformed into a mathematical program with equilibrium constraints (MPEC). Our primary contributions are twofold. First, we show that this MPEC reformulation satisfies the MPEC Mangasarian-Fromovitz constraint qualification (MPEC-MFCQ). Second, we introduce a novel global relaxation-based linear programming (LP)-Newton method (GRLPN) for solving this problem and establish corresponding convergence results. In global relaxation methods for MPECs, the algorithm for the corresponding subproblem is typically treated as a black box; to the best of our knowledge, this is the first work in the literature in which the subproblem is studied in detail. Numerical experiments demonstrate the superiority of GRLPN, in both efficiency and accuracy, over grid search and over traditional global relaxation methods whose subproblems are solved by the well-known nonlinear programming solver SNOPT.
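For readers unfamiliar with the terminology, the LaTeX sketch below illustrates the generic pattern behind the bilevel-to-MPEC-to-global-relaxation pipeline mentioned in the abstract. It is an illustrative sketch only: the symbols (the hyperparameter vector lambda, the model variable w, and the functions f, g, h, G, H and the losses) are placeholders, not the paper's actual model, and the relaxation shown in the last display is the standard Scholtes-type "global relaxation" scheme from the MPEC literature, which the paper's GRLPN builds on.

% Illustrative sketch only, not the paper's exact formulation. Requires amsmath.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align}
  % Bilevel hyperparameter selection: the upper level picks hyperparameters
  % \lambda (e.g. C and feature bounds) to minimize a validation loss, while
  % the lower level solves the SVC training problem for those hyperparameters.
  & \min_{\lambda,\, w} \ \mathcal{L}_{\mathrm{val}}(w)
    \quad \text{s.t.} \quad
    w \in \operatorname*{arg\,min}_{w'} \ \mathcal{L}_{\mathrm{train}}(w'; \lambda), \\
  % MPEC reformulation: the convex lower-level problem is replaced by its KKT
  % conditions, yielding complementarity constraints in the combined
  % variable z = (\lambda, w, \text{multipliers}).
  & \min_{z} \ f(z)
    \quad \text{s.t.} \quad
    g(z) \le 0, \quad h(z) = 0, \quad 0 \le G(z) \perp H(z) \ge 0, \\
  % Scholtes-type global relaxation: the complementarity condition is relaxed
  % by a parameter t > 0; a sequence of such subproblems is solved as t goes to 0.
  & \min_{z} \ f(z)
    \quad \text{s.t.} \quad
    g(z) \le 0, \quad h(z) = 0, \quad G(z) \ge 0, \quad H(z) \ge 0, \quad
    G_i(z)\, H_i(z) \le t \ \ \text{for all } i.
\end{align}
\end{document}

As the abstract notes, existing global relaxation methods usually hand each relaxed subproblem to a general nonlinear programming solver as a black box; GRLPN instead treats these subproblems with an LP-Newton approach, which is the aspect the paper highlights as novel.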
Text: 2312.10848v2 - Author's Original
More information
Accepted/In Press date: 17 December 2023
Additional Information: 25 pages, 2 tables, 4 figures
Keywords: math.OC
Identifiers
Local EPrints ID: 508588
URI: http://eprints.soton.ac.uk/id/eprint/508588
PURE UUID: 24603979-b1a6-493e-a302-442f3616eaab
Catalogue record
Date deposited: 27 Jan 2026 18:05
Last modified: 28 Jan 2026 03:37
Contributors
Author: Yaru Qian
Author: Qingna Li
Author: Alain Zemkoho