PSBCM-ALH Algorithm for Training Large-scale Support Vector Machine
Link: https://www.researchgate.net/publication/331273575_PSBCM-ALH_Algorithm_for_Training_Large-scale_Support_Vector_Machine
Abstract
In this paper, based on Osuna's decomposition algorithm, a proximal stochastic block coordinate minimization (PSBCM) algorithm is presented for training large-scale support vector machines. At each iteration, PSBCM solves a strongly convex quadratic programming sub-problem over brand-new training samples randomly selected by a novel heuristic shrinking method that does not require computing the full gradient of the objective function, and then applies the proposed augmented Lagrangian homotopy (ALH) algorithm to solve the sub-problem. ALH is efficient because it solves the augmented Lagrangian sub-problems exactly with a homotopy algorithm built on three important techniques: warm-starting, Cholesky factorization updating, and ε-precision verification and correction. In addition, the proposed adaptive parameter-updating method improves the performance and robustness of PSBCM-ALH, the algorithm that uses PSBCM for the outer iterations and ALH for the inner iterations. Numerical experiments on the LIBSVM data sets and the MIT-CBCL face detection database show that PSBCM-ALH is efficient and robust with different kernels and has clear advantages over the well-known LIBSVM. For the linear kernel, PSBCM-ALH is shown to be very competitive with the state-of-the-art LIBLINEAR while requiring less memory. Finally, the results demonstrate that PSBCM-ALH has linear time complexity when the kernel is linear.
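The abstract does not give implementation details, so the following is only a minimal Python sketch of how a PSBCM-style outer loop with an augmented-Lagrangian inner solve might be organized for the standard SVM dual. Everything here is an assumption for illustration: the function name train_svm_psbcm, the plain random block sampling (standing in for the paper's heuristic shrinking rule), the projected-gradient inner minimizer (standing in for ALH's exact homotopy solve with Cholesky updates and ε-precision correction), and all parameter values. It is not the authors' method.

```python
import numpy as np


def train_svm_psbcm(K, y, C=1.0, block_size=64, mu=1e-3,
                    outer_iters=200, rho=10.0, al_iters=20, seed=0):
    """Hypothetical sketch of a PSBCM-style decomposition solver for the
    SVM dual:  min_a 0.5 a^T Q a - e^T a,  s.t. 0 <= a <= C, y^T a = 0,
    where Q = (y y^T) * K.  Each outer iteration solves a strongly convex
    proximal block sub-problem with an augmented-Lagrangian inner loop."""
    rng = np.random.default_rng(seed)
    n = len(y)
    Q = (y[:, None] * y[None, :]) * K
    a = np.zeros(n)          # dual variables alpha
    lam = 0.0                # multiplier for the equality constraint

    for _ in range(outer_iters):
        # Random working set; the paper uses a heuristic shrinking rule
        # that avoids the full gradient -- plain sampling stands in here.
        B = rng.choice(n, size=min(block_size, n), replace=False)
        rest = np.setdiff1d(np.arange(n), B)

        # Strongly convex block sub-problem: the proximal term
        # (mu/2)||a_B - a_B_old||^2 adds mu*I to the block Hessian.
        H = Q[np.ix_(B, B)] + mu * np.eye(len(B))
        lin = Q[np.ix_(B, rest)] @ a[rest] - 1.0 - mu * a[B]
        r = y[rest] @ a[rest]            # fixed part of y^T a = 0

        a_B = a[B].copy()                # warm start from previous values
        step = 1.0 / (np.linalg.norm(H, 2) + rho * len(B))
        for _ in range(al_iters):
            # Minimize the augmented Lagrangian in a_B by projected
            # gradient (a stand-in for the paper's exact homotopy solve),
            # then apply the classical multiplier update.
            for _ in range(50):
                c = y[B] @ a_B + r       # equality-constraint residual
                g = H @ a_B + lin + (lam + rho * c) * y[B]
                a_B = np.clip(a_B - step * g, 0.0, C)
            lam += rho * (y[B] @ a_B + r)
        a[B] = a_B
    return a
```

For a linear kernel one would call it as a = train_svm_psbcm(X @ X.T, y) with labels y in {-1, +1}; carrying the single multiplier lam across blocks is a further simplification of a full augmented-Lagrangian scheme.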