The use of random weights for the training of multilayer networks of neurons with Heaviside characteristics

Article ID: iaor19971026
Country: United Kingdom
Volume: 22
Issue: 10/12
Start Page Number: 53
End Page Number: 61
Publication Date: Nov 1995
Journal: Mathematical and Computer Modelling
Authors: ,
Abstract:

Artificial neural networks have, in recent years, been applied very successfully in a wide range of areas. A major reason for this success has been the existence of a training algorithm called backpropagation. This algorithm relies upon the neural units in a network having input/output characteristics that are continuously differentiable. Such units are significantly harder to implement in silicon than neural units with Heaviside (step-function) characteristics. In this paper, the authors show how a training algorithm similar to backpropagation can be developed for 2-layer networks of Heaviside units by treating the network weights (i.e., interconnection strengths) as random variables. This is then used as a basis for the development of a training algorithm for networks with any number of layers, by drawing upon the idea of internal representations. Some examples are given to illustrate the performance of these learning algorithms.
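To illustrate the central idea, here is a minimal sketch (not the authors' actual algorithm, whose details are in the paper). The assumption it rests on: if a Heaviside unit's weight vector is Gaussian, w ~ N(mu, s^2 I), then the *expected* output E[H(w.x)] is a smooth probit function of mu.x, so gradients exist with respect to the mean weights and a backpropagation-style update can be applied. The network size, learning rate, smoothing scale `s`, and the XOR task below are all illustrative choices, not taken from the paper.

```python
import numpy as np
from math import erf, sqrt, pi

# Smoothed Heaviside: for w ~ N(mu, s^2 I), E[H(w.x)] = Phi(mu.x / (s*||x||)),
# where Phi is the standard normal CDF.  Here ||x|| is folded into the
# smoothing scale s, so phi(z) below serves as a differentiable surrogate
# for the hard step H(z).
_verf = np.vectorize(erf)

def phi(z, s=0.5):            # expected output of a Heaviside unit
    return 0.5 * (1.0 + _verf(z / (s * sqrt(2))))

def dphi(z, s=0.5):           # its derivative: a Gaussian pdf scaled by 1/s
    return np.exp(-z**2 / (2 * s**2)) / (s * sqrt(2 * pi))

def heaviside(z):             # the hard unit used at deployment time
    return (z >= 0).astype(float)

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)     # XOR: not linearly separable

# 2-layer network: 2 inputs -> 4 hidden Heaviside units -> 1 output unit
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sq_loss():
    return float(((phi(phi(X @ W1 + b1) @ W2 + b2) - y) ** 2).sum())

loss0 = sq_loss()
lr = 0.5
for _ in range(20000):
    # forward pass through the smoothed (mean) activations
    z1 = X @ W1 + b1; h = phi(z1)
    z2 = h @ W2 + b2; out = phi(z2)
    # backpropagate the squared-error gradient through the smooth surrogate
    d2 = (out - y) * dphi(z2)
    d1 = (d2 @ W2.T) * dphi(z1)
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)
loss1 = sq_loss()

# Deploy with hard Heaviside units, using the trained mean weights
hard = heaviside(heaviside(X @ W1 + b1) @ W2 + b2)
print("loss:", loss0, "->", loss1, "hard outputs:", hard.ravel())
```

The design point is that training happens entirely on the differentiable expected-output network, while the deployed network uses hard step units with the learned mean weights, matching the silicon-friendly motivation in the abstract.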
