Article ID: | iaor1989634 |
Country: | Japan |
Volume: | J72-D-2 |
Issue: | 3 |
Start Page Number: | 427 |
End Page Number: | 432 |
Publication Date: | Mar 1989 |
Journal: | Transactions of the Institute of Electronics, Information and Communication Engineers |
Authors: | Urahama Kiichi |
Keywords: | gradient methods, numerical analysis, neural networks |
Asymptotic stability of equilibrium states and convergence properties of several algorithms for computing sensitivity are investigated for continuous-time and discrete-time models of neural networks. On the basis of the stability theory of dynamical systems, a class of networks is defined in which every interaction between neurons is sufficiently weak. This class includes any feedforward network, such as a perceptron, with weak (possibly zero) feedback, as well as networks in which all neurons interact weakly with one another. The continuous-time model, i.e., the system of differential equations describing the dynamics of this class of networks, is proven to be globally asymptotically stable, and the rate of convergence to its equilibrium state is inversely proportional to the strength of the interaction between neurons. A discrete-time model describing generally asynchronous behavior of the networks is also shown to converge globally. In addition, both an analog method and a general digital method for computing the sensitivity, i.e., the gradient vectors of the potential of every neuron with respect to any synapse weight, are shown to be globally convergent for this class of networks. [In Japanese.]
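The global convergence the abstract describes can be illustrated with a minimal numerical sketch. The model below is the standard additive continuous-time network du/dt = -u + W·tanh(u) + b, which is an assumption for illustration (the paper's exact equations are not given in the abstract); the weak-interaction condition is approximated by keeping the coupling matrix W small in norm, which makes the right-hand side a contraction, so trajectories from different initial states reach the same equilibrium:

```python
import numpy as np

# Hypothetical instance of a weakly interacting network, not the
# paper's exact model: du/dt = -u + W tanh(u) + b with small ||W||.
# Weak coupling makes the dynamics contracting, so every trajectory
# converges to a single globally asymptotically stable equilibrium.
rng = np.random.default_rng(0)
n = 5
W = 0.1 * rng.standard_normal((n, n))  # weak interactions: norm well below 1
b = rng.standard_normal(n)

def simulate(u0, dt=0.01, steps=5000):
    """Forward-Euler integration of du/dt = -u + W tanh(u) + b."""
    u = u0.copy()
    for _ in range(steps):
        u += dt * (-u + W @ np.tanh(u) + b)
    return u

# Two widely separated initial states converge to the same equilibrium.
u_a = simulate(rng.standard_normal(n))
u_b = simulate(10.0 * rng.standard_normal(n))
gap = np.max(np.abs(u_a - u_b))           # ~0 once both trajectories settle
residual = np.max(np.abs(-u_a + W @ np.tanh(u_a) + b))  # equilibrium check
```

Shrinking the factor on W speeds up the settling, consistent with the abstract's claim that the convergence rate is inversely proportional to the interaction strength.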