An improved neural classification network for the two-group problem

Article ID: iaor20001755
Country: United Kingdom
Volume: 26
Issue: 5
Start Page Number: 443
End Page Number: 460
Publication Date: Apr 1999
Journal: Computers and Operations Research
Authors: ,
Keywords: statistics: general, statistics: regression
Abstract:

In this paper we present the neural network model known as the mixture-of-experts (MOE) and evaluate its accuracy and robustness. We do this by comparing the classification accuracy of MOE, the back-propagation neural network (BPN), Fisher's discriminant analysis, logistic regression, k-nearest neighbor, and kernel density estimation on five real-world two-group data sets. Our results lead to three major conclusions: (1) the MOE network architecture is more accurate than BPN; (2) MOE tends to be more accurate than the parametric and non-parametric methods investigated; (3) MOE is a far more robust classifier than the other methods for the two-group problem.
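
For readers unfamiliar with the architecture, the sketch below illustrates the general mixture-of-experts idea for a two-group problem: a softmax gating network assigns per-sample mixing weights to several simple expert networks, and the final class probability is the gate-weighted combination of the experts' outputs. This is a minimal NumPy illustration, not the model studied in the paper; the number of experts, the logistic experts, the squared-error training objective, and the learning rate are all assumptions made for the example.

```python
# Minimal mixture-of-experts (MOE) sketch for two-group classification.
# Illustrative only: expert count, logistic experts, squared-error loss,
# and hyperparameters are assumptions, not the authors' configuration.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class MixtureOfExperts:
    def __init__(self, n_features, n_experts=3, lr=0.5):
        self.We = rng.normal(0.0, 0.1, (n_experts, n_features))  # expert weights
        self.Wg = rng.normal(0.0, 0.1, (n_experts, n_features))  # gating weights
        self.lr = lr

    def forward(self, X):
        experts = sigmoid(X @ self.We.T)   # each expert's P(group 1 | x)
        gates = softmax(X @ self.Wg.T)     # per-sample mixing proportions
        p = (experts * gates).sum(axis=1)  # gate-weighted mixture output
        return experts, gates, p

    def fit(self, X, y, epochs=2000):
        for _ in range(epochs):
            experts, gates, p = self.forward(X)
            err = p - y  # gradient of 0.5 * squared error w.r.t. p
            # back through each expert's sigmoid, scaled by its gate weight
            d_exp = (err[:, None] * gates) * experts * (1.0 - experts)
            # back through the softmax gate: dp/dz_j = g_j * (e_j - p)
            d_gate = err[:, None] * gates * (experts - p[:, None])
            self.We -= self.lr * d_exp.T @ X / len(X)
            self.Wg -= self.lr * d_gate.T @ X / len(X)

    def predict(self, X):
        return (self.forward(X)[2] >= 0.5).astype(int)

# Usage on synthetic two-group data (hypothetical, for demonstration only):
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
moe = MixtureOfExperts(n_features=4)
moe.fit(X, y)
print("training accuracy:", (moe.predict(X) == y).mean())
```

The key design point the abstract alludes to is that the gate partitions the input space softly, so different experts can specialize in different regions, which is one explanation for the robustness advantage reported over a single monolithic BPN.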
