Adjusted support vector machines based on a new loss function

Article ID: iaor20101507
Volume: 174
Issue: 1
Start Page Number: 83
End Page Number: 101
Publication Date: Feb 2010
Journal: Annals of Operations Research
Authors: , ,
Keywords: classification, support vector machines
Abstract:

The support vector machine (SVM) has attracted considerable attention recently due to its successful applications in various domains. However, by maximizing the margin of separation between the two classes in a binary classification problem, SVM solutions often suffer from two serious drawbacks. First, the SVM separating hyperplane is usually very sensitive to the training samples, since it depends strongly on the support vectors, which are only a few points located on the wrong side of their corresponding margin boundaries. Second, the separating hyperplane is equidistant from the two classes, which are treated as equally important when optimizing its location, regardless of the number of training samples and their dispersion in each class. In this paper, we propose a new SVM solution, the adjusted support vector machine (ASVM), based on a new loss function that adjusts the SVM solution to take into account the sample sizes and dispersions of the two classes. Numerical experiments show that the ASVM outperforms the conventional SVM, especially when the two classes differ substantially in sample size and dispersion.
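
The abstract contrasts the conventional soft-margin SVM, which treats both classes symmetrically, with an adjusted solution that accounts for differences in sample size and dispersion between the classes. The following minimal sketch is not the paper's ASVM loss. It only illustrates the imbalance issue on synthetic two-dimensional data and uses scikit-learn's off-the-shelf class_weight reweighting as a rough stand-in for the kind of adjustment the abstract describes; the class means, dispersions, and sample sizes below are assumptions chosen purely for illustration.

# Minimal sketch (assumed setup, not the paper's ASVM): compare a plain
# soft-margin SVM with a class-weighted SVM on data where the two classes
# differ in sample size and dispersion.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)

# Class +1: large, tightly clustered sample; class -1: small, dispersed sample.
X_pos = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(500, 2))
X_neg = rng.normal(loc=[-2.0, -2.0], scale=2.0, size=(50, 2))
X = np.vstack([X_pos, X_neg])
y = np.hstack([np.ones(500), -np.ones(50)])

# Conventional SVM: both classes enter the hinge loss symmetrically.
plain = SVC(kernel="linear", C=1.0).fit(X, y)

# Class-weighted SVM: hinge-loss penalties reweighted by inverse class
# frequency, a crude proxy for adjusting the hyperplane toward the
# larger, tighter class.
weighted = SVC(kernel="linear", C=1.0, class_weight="balanced").fit(X, y)

for name, model in [("plain SVM", plain), ("class-weighted SVM", weighted)]:
    score = balanced_accuracy_score(y, model.predict(X))
    print(f"{name}: balanced accuracy = {score:.3f}")

Under these assumptions, the class-weighted fit typically shifts the separating hyperplane away from the small, dispersed class and improves balanced accuracy on it; the paper's ASVM instead builds an adjustment for sample size and dispersion directly into the loss function.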
