Nonmonotone BFGS‐trained recurrent neural networks for temporal sequence processing


Article ID: iaor20113274
Volume: 217
Issue: 12
Start Page Number: 5421
End Page Number: 5441
Publication Date: Feb 2011
Journal: Applied Mathematics and Computation
Authors:
Keywords: training
Abstract:

In this paper we propose a nonmonotone approach to recurrent neural network training for temporal sequence processing applications. This approach allows the learning performance to deteriorate in some iterations; nevertheless, the network's performance improves over time. A self-scaling BFGS method is equipped with an adaptive nonmonotone technique that employs approximations of the Lipschitz constant, and is tested on a set of sequence processing problems. Simulation results show that the proposed algorithm outperforms standard BFGS as well as other methods previously applied to these sequences, providing an effective modification that is capable of training recurrent networks of various architectures.
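The core idea the abstract describes, accepting a step even when it worsens the objective, provided it improves on the worst of a window of recent values, can be sketched as a nonmonotone Armijo line search wrapped around a standard BFGS inverse-Hessian update. The sketch below is a minimal illustration of that mechanism only: the `memory` window, the backtracking factor, and the test function are illustrative choices, and it omits the paper's self-scaling and Lipschitz-based adaptive components.

```python
import numpy as np

def nonmonotone_bfgs(f, grad, x0, memory=10, c1=1e-4, max_iter=200, tol=1e-6):
    """Minimize f with BFGS plus a nonmonotone Armijo line search.

    A step of length a along direction d is accepted when
        f(x + a*d) <= max(recent f values) + c1 * a * g.T @ d,
    so individual iterations may increase f. This is an illustrative
    sketch, not the authors' exact algorithm (which also uses
    self-scaling and Lipschitz-constant approximations).
    """
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    recent = [f(x)]                    # window of recent objective values
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # quasi-Newton search direction
        fref = max(recent)             # nonmonotone reference value
        a = 1.0
        while f(x + a * d) > fref + c1 * a * (g @ d) and a > 1e-12:
            a *= 0.5                   # backtrack until acceptance
        s = a * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition: safe to update
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
        x, g = x_new, g_new
        recent.append(f(x))
        if len(recent) > memory:       # keep only the last `memory` values
            recent.pop(0)
    return x

# Usage on the Rosenbrock function, a standard nonconvex test problem.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
xstar = nonmonotone_bfgs(f, grad, np.array([-1.2, 1.0]))
```

Because the acceptance test compares against the maximum over the window rather than the previous value alone, the method can cross small barriers in the error surface; in the paper this relaxation is what lets recurrent-network training escape the shallow local structure that a strictly monotone BFGS would stall on.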
