Effect of data standardization on neural network training

Article ID: iaor19971515
Country: United Kingdom
Volume: 24
Issue: 4
Start Page Number: 385
End Page Number: 397
Publication Date: Aug 1996
Journal: OMEGA
Abstract:

Data transformation is a popular option in training neural networks. This study evaluates the effectiveness of two well-known transformation methods: linear transformation and statistical standardization, referred to jointly as data standardization. A carefully designed experiment is used in which feedforward networks were trained on data from two-group classification problems. Different kinds of classification problems, ranging from relatively simple to hard, were generated. Other experimental factors include network architecture, sample size, and the sample proportion of group 1 members. Three performance measurements for the effect of data standardization are employed. The results suggest that networks trained on standardized data yield better results in general, but the advantage diminishes as network and sample size become large. In other words, neural networks exhibit a self-scaling capability. In addition, the impact of data standardization on the performance of the training algorithm, in terms of computation time and number of iterations, is evaluated. The results indicate that, overall, data standardization slows down training. Finally, these results are illustrated with a data set obtained from the American Telephone and Telegraph Company.
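The two transformations evaluated in the study are standard preprocessing steps. A minimal sketch of their usual definitions follows; the paper's exact conventions are not reproduced here, so the [0, 1] target range for the linear transformation and the per-feature z-score are assumptions:

```python
import numpy as np

def linear_transform(x, low=0.0, high=1.0):
    """Min-max linear transformation: rescale each feature to [low, high]."""
    x = np.asarray(x, dtype=float)
    xmin, xmax = x.min(axis=0), x.max(axis=0)
    return low + (x - xmin) * (high - low) / (xmax - xmin)

def statistical_standardize(x):
    """Z-score standardization: zero mean, unit variance per feature."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean(axis=0)) / x.std(axis=0)

# Example: a small two-feature sample (hypothetical data).
X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])
X_lin = linear_transform(X)          # each column spans [0, 1]
X_std = statistical_standardize(X)   # each column has mean 0, std 1
```

Either transformed array would then be fed to the network in place of the raw inputs; the study's comparison is between training on such standardized data versus the raw data.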
