Gradient-based back-propagation dynamical iterative learning scheme for the neuro-fuzzy inference system

Article ID: iaor2016620
Volume: 33
Issue: 1
Start Page Number: 70
End Page Number: 76
Publication Date: Feb 2016
Journal: Expert Systems
Authors: , , ,
Keywords: neural networks, fuzzy sets, learning, optimization
Abstract:

In this paper, a gradient-based back-propagation dynamical iterative learning algorithm is proposed for structure optimization and parameter tuning of the neuro-fuzzy system. The premise and consequent parameters of the neuro-fuzzy model are initialized randomly and then tuned by the proposed iterative algorithm. The learning algorithm is based on the first-order partial derivative of the model output with respect to the structure parameters, which quantifies the sensitivity of the model to each parameter. These sensitivity values are used to set the tuning factors and the parameter-update step sizes, yielding an adaptive dynamical iterative scheme that adjusts the learning procedure to the current state of performance during optimization. Since larger tuning step sizes speed up convergence and smaller ones slow it down, each step size is set according to the calculated sensitivity of the model to the corresponding parameter. The proposed learning algorithm is compared with the least-squares back-propagation method, the genetic algorithm, and the chaotic genetic algorithm for neuro-fuzzy model structure optimization. Smaller mean square error and shorter learning time are the criteria sought in this paper, and the performance of the proposed learning algorithm is verified with respect to these criteria.
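The abstract's core idea, scaling each parameter's update step by the model's first-order output sensitivity, can be illustrated with a short sketch. The following Python example is a minimal, hypothetical illustration and not the authors' code: it trains a two-rule Takagi-Sugeno neuro-fuzzy model by gradient descent, uses finite differences in place of the paper's analytic first-order derivatives, and adopts one plausible mapping from sensitivity to step size. All function names, the model structure, and the constants are illustrative assumptions.

```python
# Sketch only (assumptions throughout): a two-rule Takagi-Sugeno neuro-fuzzy
# model with Gaussian memberships and linear consequents, trained by gradient
# descent with per-parameter step sizes scaled by the model's sensitivity
# |dy/dtheta|, as the abstract describes at a high level.
import numpy as np

rng = np.random.default_rng(0)

def model(theta, x):
    """theta = [c1, s1, c2, s2, a1, b1, a2, b2]: premise (c, s) and
    consequent (a, b) parameters of two TSK rules."""
    c1, s1, c2, s2, a1, b1, a2, b2 = theta
    w1 = np.exp(-0.5 * ((x - c1) / s1) ** 2)   # rule firing strengths
    w2 = np.exp(-0.5 * ((x - c2) / s2) ** 2)
    y1, y2 = a1 * x + b1, a2 * x + b2          # rule consequents
    return (w1 * y1 + w2 * y2) / (w1 + w2 + 1e-12)

def sensitivity(theta, x, eps=1e-6):
    """First-order derivative of the output w.r.t. each parameter.
    Central finite differences stand in for analytic derivatives."""
    grads = np.zeros((theta.size, x.size))
    for i in range(theta.size):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        grads[i] = (model(tp, x) - model(tm, x)) / (2 * eps)
    return grads

# Toy data: noisy samples of a nonlinear target function.
x = np.linspace(-2.0, 2.0, 200)
t = np.tanh(2.0 * x) + 0.05 * rng.standard_normal(x.size)

theta = rng.normal(0.0, 1.0, 8)   # random initialization, per the abstract
base_lr = 0.05

for epoch in range(500):
    theta[[1, 3]] = np.maximum(theta[[1, 3]], 0.2)  # keep MF widths positive
    err = model(theta, x) - t
    S = sensitivity(theta, x)               # dy/dtheta drives the step sizes
    grad_mse = 2.0 * (S @ err) / x.size     # chain rule through the MSE loss
    # One plausible reading of the adaptive scheme: parameters the output is
    # highly sensitive to get proportionally smaller raw steps.
    step = base_lr / (1.0 + np.abs(S).mean(axis=1))
    theta -= step * grad_mse

print("final MSE:", np.mean((model(theta, x) - t) ** 2))
```

In this sketch the per-parameter step size shrinks as the mean absolute sensitivity grows, which keeps updates balanced across premise and consequent parameters; the paper's actual tuning-factor rule may differ.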
