Multiple costs based decision making with back‐propagation neural networks

Article ID: iaor20121422
Volume: 52
Issue: 3
Start Page Number: 657
End Page Number: 663
Publication Date: Feb 2012
Journal: Decision Support Systems
Authors:
Keywords: neural networks, statistics: regression
Abstract:

Current research on cost‐sensitive neural networks (CNN) for decision making considers only a single cost, which may not be feasible for real cost‐sensitive decisions that involve multiple costs. We propose to modify the existing model, the traditional back‐propagation neural network (TNN), by extending the back‐propagation error equation to multiple‐cost decisions. In this multiple‐cost extension, all costs are normalized to the same interval (i.e. between 0 and 1) as the error estimate generated by the TNN. A comparative analysis of accuracy was performed for three pairings with constant costs: (1) TNN vs. CNN with one constant cost (CNN‐1C), (2) TNN vs. CNN with two constant costs (CNN‐2C), and (3) CNN‐1C vs. CNN‐2C. A similar accuracy analysis was made for non‐constant costs: (1) TNN vs. CNN with one non‐constant cost (CNN‐1NC), (2) TNN vs. CNN with two non‐constant costs (CNN‐2NC), and (3) CNN‐1NC vs. CNN‐2NC. Furthermore, we compared the misclassification cost of the CNNs for both constant and non‐constant costs (CNN‐1C vs. CNN‐2C and CNN‐1NC vs. CNN‐2NC). Our findings demonstrate a trade‐off between accuracy and misclassification cost in the proposed CNN model. To obtain higher accuracy and lower misclassification cost, our results suggest merging all constant cost matrices into a single constant cost matrix for decision making. For multiple non‐constant cost matrices, our results suggest maintaining separate matrices to enhance accuracy and reduce misclassification cost.
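The abstract does not reproduce the extended error equation itself, so the sketch below is only an illustration of the idea under stated assumptions: each cost matrix is normalized into [0, 1] and an averaged, normalized misclassification cost is folded into the output‐layer back‐propagation error. The names (train_cnn, cost_factor, normalize) and the exact form of the cost‐scaled delta are assumptions for illustration, not the authors' formulation.

# Hedged sketch (not the paper's exact equations): a tiny back-propagation
# network whose output-layer error is scaled by misclassification costs
# taken from one or more cost matrices, each normalized to [0, 1] so the
# cost term lives on the same scale as the TNN error signal.
import numpy as np

rng = np.random.default_rng(0)

def normalize(cost_matrix):
    # Scale a cost matrix into [0, 1], mirroring the normalization step in the abstract.
    c = np.asarray(cost_matrix, dtype=float)
    return c / c.max() if c.max() > 0 else c

def cost_factor(cost_matrices, true_idx, pred_idx):
    # Average normalized cost of predicting pred_idx when the true class is true_idx.
    return np.mean([normalize(c)[true_idx, pred_idx] for c in cost_matrices])

def train_cnn(X, y_onehot, cost_matrices, hidden=8, lr=0.1, epochs=500):
    n_in, n_out = X.shape[1], y_onehot.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, n_out))
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        for x, t in zip(X, y_onehot):
            h = sigmoid(x @ W1)                      # forward pass, hidden layer
            o = sigmoid(h @ W2)                      # forward pass, output layer
            true_idx, pred_idx = int(t.argmax()), int(o.argmax())
            # Assumed cost-sensitive twist: amplify the output error by the
            # averaged, normalized misclassification cost for this example.
            scale = 1.0 + cost_factor(cost_matrices, true_idx, pred_idx)
            delta_o = scale * (o - t) * o * (1 - o)  # cost-scaled output delta
            delta_h = (delta_o @ W2.T) * h * (1 - h) # hidden-layer delta
            W2 -= lr * np.outer(h, delta_o)
            W1 -= lr * np.outer(x, delta_h)
    return W1, W2

# Toy usage: two classes with two constant cost matrices (the "CNN-2C" setting).
X = rng.normal(size=(40, 4))
labels = (X[:, 0] + X[:, 1] > 0).astype(int)
Y = np.eye(2)[labels]
C1 = [[0, 5], [1, 0]]   # e.g. false positives cost more
C2 = [[0, 2], [4, 0]]   # a second, differently shaped cost matrix
W1, W2 = train_cnn(X, Y, [C1, C2])

Merging the two matrices beforehand (element-wise averaging of the normalized matrices) would correspond to the single merged constant-cost setup the abstract recommends, whereas keeping them in the list corresponds to maintaining separate matrices.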
