Article ID: iaor20106365
Volume: 47
Issue: 2
Start Page Number: 179
End Page Number: 206
Publication Date: Oct 2010
Journal: Computational Optimization and Applications
Authors: Tseng Paul, Yun Sangwoon
Keywords: support vector machines
Support vector machine (SVM) training may be posed as a large quadratic program (QP) with bound constraints and a single linear equality constraint. We propose a (block) coordinate gradient descent method for solving this problem and, more generally, linearly constrained smooth optimization. Our method is closely related to decomposition methods currently popular for SVM training. We establish global convergence and, under a local error bound assumption (which is satisfied by the SVM QP), a linear rate of convergence for our method when the coordinate block is chosen by a Gauss-Southwell-type rule to ensure sufficient descent. We show that, for the SVM QP with n variables, this rule can be implemented in O(n) operations.
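To make the setting concrete, below is a minimal sketch of coordinate descent on the SVM dual QP described in the abstract: minimize 0.5·a'Qa − Σa subject to 0 ≤ a ≤ C and y'a = 0, updating a working pair of coordinates chosen by a most-violating-pair rule (a Gauss-Southwell-type selection). This is an illustrative simplification in the spirit of decomposition methods, not the paper's exact algorithm; the function name `svm_dual_cd` and all parameters are assumptions for the example.

```python
import numpy as np

def svm_dual_cd(Q, y, C, max_iter=1000, tol=1e-6):
    """Pairwise coordinate descent for the SVM dual QP
        min_a 0.5*a'Qa - sum(a)   s.t.  0 <= a <= C,  y'a = 0,
    where Q[i,j] = y[i]*y[j]*K(x_i, x_j).  The working pair is the
    most violating pair, a Gauss-Southwell-type selection rule.
    Illustrative sketch only, not the paper's exact method."""
    n = len(y)
    a = np.zeros(n)
    g = -np.ones(n)                      # gradient Q@a - 1 at a = 0
    for _ in range(max_iter):
        s = -y * g                       # per-coordinate violation scores
        # Coordinates free to move "up" / "down" along the constraint:
        up = ((y > 0) & (a < C)) | ((y < 0) & (a > 0))
        dn = ((y > 0) & (a > 0)) | ((y < 0) & (a < C))
        i = int(np.argmax(np.where(up, s, -np.inf)))
        j = int(np.argmin(np.where(dn, s, np.inf)))
        if s[i] - s[j] < tol:            # approximate KKT conditions hold
            break
        # Step along d with d_i = y_i, d_j = -y_j (preserves y'a = 0);
        # exact minimizing step, clipped so both coords stay in [0, C].
        quad = max(Q[i, i] + Q[j, j] - 2.0 * y[i] * y[j] * Q[i, j], 1e-12)
        t = (s[i] - s[j]) / quad
        t = min(t, C - a[i] if y[i] > 0 else a[i])
        t = min(t, a[j] if y[j] > 0 else C - a[j])
        a[i] += y[i] * t
        a[j] -= y[j] * t
        g += (y[i] * Q[:, i] - y[j] * Q[:, j]) * t
    return a
```

Each iteration touches only two coordinates yet keeps the iterate feasible for the equality constraint, which is why decomposition methods of this kind scale to large training sets; the paper's contribution concerns block generalizations of such rules with provable global and linear convergence.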