Article ID: iaor20002988
Country: Poland
Issue: 3/4
Start Page Number: 151
End Page Number: 159
Publication Date: Jan 1996
Journal: Badania Operacyjne I Decyzje
Authors: Meier Daniel Christopher
Classical neural network algorithms generally construct classification systems from hyperplanes, which are very general and not problem-specific. In order to build efficient classifiers that yield good generalization with a restricted number of input factors, we propose structurally constrained neural networks (SCNN). The idea of SCNN can be applied to hetero- and auto-associative memories, topographic feature maps, as well as supervised learning. In this article, a particular class of neural networks, namely Structurally Constrained Resource-Allocating Networks (SCRAN), is introduced. The structures of the individual hidden neurons are constrained by continuity. In geometric terms, the constraint corresponds to irregular, concave polytopes, which take the place of the hyperplanes used in classical networks. The method was tested on the problem of creditworthiness analysis in specific industry sectors. A sensitivity analysis with cross-validation showed that SCRAN outperform standard resource-allocating networks in learning and generalization ability. Moreover, the number of required input factors is considerably reduced.
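The abstract does not give the algorithmic details of SCRAN, so the sketch below illustrates only the baseline it builds on: a standard resource-allocating network in the spirit of Platt (1991), which grows a new radial-basis hidden unit when an input is both novel and poorly predicted, and otherwise adapts the existing output weights. The class name, thresholds, and learning rate are illustrative assumptions, not the authors' implementation, and the structural (polytope) constraints of SCRAN are not shown.

```python
import numpy as np

class ResourceAllocatingNetwork:
    """Minimal Platt-style resource-allocating RBF network (illustrative sketch).

    A new Gaussian hidden unit is allocated when an input is both novel
    (far from all existing centers) and badly predicted (large error);
    otherwise the output weights are adjusted by a gradient (LMS) step.
    """

    def __init__(self, epsilon=0.5, delta=0.3, kappa=0.9, lr=0.05):
        self.epsilon = epsilon   # error threshold for allocating a new unit
        self.delta = delta       # distance threshold for allocating a new unit
        self.kappa = kappa       # overlap factor setting the new unit's width
        self.lr = lr             # learning rate for the LMS update
        self.centers, self.widths, self.weights = [], [], []
        self.bias = 0.0

    def _activations(self, x):
        # Gaussian response of every hidden unit to input x.
        return np.array([np.exp(-np.sum((x - c) ** 2) / (2.0 * w ** 2))
                         for c, w in zip(self.centers, self.widths)])

    def predict(self, x):
        if not self.centers:
            return self.bias
        return float(np.dot(self.weights, self._activations(np.asarray(x, dtype=float))) + self.bias)

    def partial_fit(self, x, y):
        x = np.asarray(x, dtype=float)
        error = y - self.predict(x)
        dist = (min(np.linalg.norm(x - c) for c in self.centers)
                if self.centers else np.inf)

        if abs(error) > self.epsilon and dist > self.delta:
            # Novel and poorly predicted: allocate a unit centred on this input.
            self.centers.append(x.copy())
            self.widths.append(self.kappa * (dist if np.isfinite(dist) else 1.0))
            self.weights.append(error)
        elif self.centers:
            # Otherwise adjust output weights and bias toward the target.
            phi = self._activations(x)
            self.weights = list(np.asarray(self.weights) + self.lr * error * phi)
            self.bias += self.lr * error
        else:
            self.bias += self.lr * error
        return error
```

In a creditworthiness setting such as the one described in the article, each `x` would be a vector of input factors for a firm and `y` a creditworthiness label; presenting the training cases one by one through `partial_fit` lets the network allocate hidden units only where the data require them, which is the resource-allocating behaviour that SCRAN constrains further.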