Article ID: | iaor20042849 |
Country: | Japan |
Volume: | 46 |
Issue: | 4 |
Start Page Number: | 395 |
End Page Number: | 408 |
Publication Date: | Dec 2003 |
Journal: | Journal of the Operations Research Society of Japan |
Authors: | Mori Masao, Yajima Yasutoshi, Ohi Hiroko |
Keywords: | datamining |
We propose linear programming formulations of support vector machines (SVMs). Unlike standard SVMs, which require solving quadratic programs, our approach explores a fairly low-dimensional subspace of the feature space to construct a nonlinear discriminator, which can therefore be obtained by solving a smaller linear program. We demonstrate that an orthonormal basis of the subspace can be treated implicitly through the eigenvectors of the Gram matrix defined by the associated kernel function. When the number of given data points is very large, we construct the subspace by random sampling of data points. Numerical experiments indicate that a subspace generated from fewer than 2% of the training data points achieves reasonable performance on a fairly large instance with 60,000 data points.
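The abstract's pipeline — sample a small subset of points, build an orthonormal basis of the induced subspace from the eigenvectors of the sampled Gram matrix, then train a linear-programming SVM in those coordinates — can be sketched roughly as follows. This is a minimal illustration, not the authors' exact formulation: the 1-norm soft-margin LP objective, the RBF kernel, and all data and parameter values here are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def rbf(A, B, gamma=1.0):
    """Pairwise RBF kernel matrix between row sets A and B."""
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

# Toy two-class data (assumed, for illustration only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (40, 2)), rng.normal(1, 0.5, (40, 2))])
y = np.hstack([-np.ones(40), np.ones(40)])

# 1) Randomly sample a small subset of points to span the subspace.
m = 10
S = X[rng.choice(len(X), m, replace=False)]

# 2) Implicit orthonormal basis via eigenvectors of the m x m Gram matrix:
#    coordinates of x in the subspace are Lambda^{-1/2} U^T k(x, S).
G = rbf(S, S)
lam, U = np.linalg.eigh(G)
keep = lam > 1e-10                    # drop numerically null directions
T = U[:, keep] / np.sqrt(lam[keep])
Z = rbf(X, S) @ T                     # n x r feature representation

# 3) 1-norm soft-margin SVM as a linear program:
#    min ||w||_1 + C*sum(xi)  s.t.  y_i (z_i . w + b) >= 1 - xi, xi >= 0,
#    with w = u - v, b = bp - bm and u, v, bp, bm, xi all nonnegative.
n, r = Z.shape
C = 1.0
c = np.concatenate([np.ones(2 * r), [0.0, 0.0], C * np.ones(n)])
Yz = y[:, None] * Z
A_ub = -np.hstack([Yz, -Yz, y[:, None], -y[:, None], np.eye(n)])
b_ub = -np.ones(n)
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * (2 * r + 2 + n))
w = res.x[:r] - res.x[r:2 * r]
b = res.x[2 * r] - res.x[2 * r + 1]
acc = (np.sign(Z @ w + b) == y).mean()
```

The LP has only 2r + 2 + n variables, where r <= m is the effective subspace dimension, so a small sample keeps the program far smaller than a kernel method over all n points — the scaling behavior the abstract reports for the 60,000-point instance.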