Article ID: iaor20091302
Country: Japan
Volume: 51
Issue: 1
Start Page Number: 95
End Page Number: 110
Publication Date: Mar 2008
Journal: Journal of the Operations Research Society of Japan
Authors: Koda Masato, Sano Natsuki, Suzuki Hideo
Keywords: gradient methods, neural networks, statistics: inference, data mining
Classifiers are used for pattern recognition in various fields, including data mining. Boosting is an ensemble learning method that boosts (enhances) the accuracy of a single classifier. We propose a new, robust boosting method that uses the zero-one step function as its loss function. In deriving the method, the MarginBoost technique is blended with a stochastic gradient approximation algorithm called Stochastic Noise Reaction (SNR). Intensive numerical experiments show that the proposed method achieves lower test error rates than AdaBoost in noisy, mislabeled situations.
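The record contains only the abstract, so the following is a minimal sketch of the idea it describes: the non-differentiable zero-one margin loss is smoothed with Gaussian noise, its gradient is estimated by Monte Carlo, and that gradient supplies the example weights in a MarginBoost-style round with decision stumps. The names `snr_gradient` and `margin_boost_snr`, the smoothing scale `sigma`, and the fixed step size are illustrative assumptions; this is a generic noise-smoothing stand-in, not the paper's actual SNR derivation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def zero_one_loss(margin):
    """Step loss on the margin m = y * F(x): 1 if misclassified, else 0."""
    return (margin <= 0).astype(float)

def snr_gradient(margin, sigma=0.5, n_noise=200, rng=None):
    """Monte-Carlo gradient of the Gaussian-smoothed zero-one loss.

    For eps ~ N(0, 1):
        d/dm E[loss(m + sigma * eps)] = E[loss(m + sigma * eps) * eps] / sigma.
    (A generic noise-smoothing estimate standing in for the paper's SNR.)
    """
    if rng is None:
        rng = np.random.default_rng(0)
    eps = rng.standard_normal((n_noise,) + margin.shape)
    return (zero_one_loss(margin + sigma * eps) * eps).mean(axis=0) / sigma

def margin_boost_snr(X, y, n_rounds=50, sigma=0.5, step=0.5):
    """MarginBoost-style additive model F(x) = sum_t alpha_t * h_t(x).

    Each round weights examples by the negative smoothed-loss gradient and
    fits a depth-1 decision tree (stump). Labels y must be in {-1, +1}.
    """
    rng = np.random.default_rng(0)
    F = np.zeros(len(y), dtype=float)
    ensemble = []
    for _ in range(n_rounds):
        grad = snr_gradient(y * F, sigma=sigma, rng=rng)
        w = np.maximum(-grad, 1e-12)  # gradient is <= 0 in expectation
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        F += step * stump.predict(X)  # fixed step size instead of a line search
        ensemble.append((step, stump))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of the stumps."""
    score = sum(alpha * stump.predict(X) for alpha, stump in ensemble)
    return np.sign(score)
```

On a small toy dataset with a fraction of labels flipped, this loop could be compared against scikit-learn's AdaBoostClassifier to mimic, qualitatively, the noisy-label comparison the abstract reports.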