Affiliation:
1. Beijing Institute of Technology
2. University of Southampton
Abstract
Support vector classification (SVC) is an effective tool for classification tasks in machine learning, but its performance relies on the selection of appropriate hyperparameters. In this paper, we focus on identifying the optimal value of the regularization hyperparameter \(C\), as well as determining bounds on the features of an SVC problem. This implies that the number of hyperparameters in our SVC can potentially be very large, and it is well known in machine learning that this can lead to the so-called {\em curse of dimensionality}. To address the challenge of selecting multiple hyperparameters, we formulate the problem as a bilevel optimization problem, which is then transformed into a mathematical program with equilibrium constraints (MPEC). Our first contribution is a proof that a Mangasarian–Fromovitz-type constraint qualification holds for the latter reformulation of the problem. Furthermore, we introduce a novel linear programming (LP)-Newton-based global relaxation method (GRLPN) for solving this problem and provide corresponding convergence results. Typically, in global relaxation methods for MPECs, the algorithm for the corresponding subproblem is treated as a blackbox; here, possibly for the first time in the literature, the subproblem is studied in detail. Numerical experiments substantiate the superiority of GRLPN over grid search and over the global relaxation method solved with the well-known nonlinear programming solver SNOPT.
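To illustrate the kind of formulation the abstract refers to, a generic bilevel hyperparameter-selection problem for SVC can be sketched as follows; the notation here (\(\mathcal{T}\) for the training set, \(\mathcal{V}\) for the validation set, \(w,b,\xi\) for the classifier variables) is a standard placeholder, not necessarily the paper's own:

\[
\begin{aligned}
\min_{C > 0}\quad & \sum_{j \in \mathcal{V}} \ell\bigl(y_j,\; w^{*\top} x_j + b^*\bigr) && \text{(upper level: validation error)}\\
\text{s.t.}\quad & (w^*, b^*, \xi^*) \in \operatorname*{arg\,min}_{w,\,b,\,\xi}\;\; \tfrac{1}{2}\|w\|^2 + C \sum_{i \in \mathcal{T}} \xi_i\\
& \qquad \text{s.t.}\;\; y_i\bigl(w^\top x_i + b\bigr) \ge 1 - \xi_i,\;\; \xi_i \ge 0,\;\; i \in \mathcal{T}, && \text{(lower level: SVC training)}
\end{aligned}
\]

where \(\ell\) is a validation loss. Replacing the convex lower-level training problem by its Karush–Kuhn–Tucker conditions yields the MPEC mentioned above; additional upper-level variables, such as the feature bounds considered in this paper, enter the lower-level constraints in the same way.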
Publisher
Research Square Platform LLC