Global optimization for artificial neural networks: A tabu search application
Article ID: iaor19992555
Country: Netherlands
Volume: 106
Issue: 2/3
Start Page Number: 570
End Page Number: 584
Publication Date: Apr 1998
Journal: European Journal of Operational Research
Authors: , , ,
Keywords: heuristics
Abstract:

The ability of neural networks to closely approximate unknown functions to any degree of desired accuracy has generated considerable demand for neural network research in business. The attractiveness of neural network research stems from researchers' need to approximate models within the business environment without a priori knowledge of the true underlying function. Gradient techniques, such as backpropagation, are currently the most widely used methods for neural network optimization. Because these techniques search for local solutions, they are subject to local convergence and can therefore perform poorly even on simple problems when forecasting out-of-sample. Consequently, a global search algorithm is warranted. In this paper we examine tabu search (TS) as a possible alternative to the problematic backpropagation approach. A Monte Carlo study was conducted to test the appropriateness of TS as a global search technique for optimizing neural networks. Holding the neural network architecture constant, 530 independent runs were conducted for each of seven test functions, including a production function that exhibits both increasing and diminishing marginal returns and the Mackey–Glass chaotic time series. In the resulting comparison, TS derived solutions that were significantly superior to backpropagation solutions for in-sample, interpolation, and extrapolation test data on all seven test functions. TS also required fewer function evaluations to reach these solutions.
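The abstract does not detail the authors' TS implementation. As a rough illustration of the idea it describes (tabu search over the weight space of a fixed-architecture network, in place of gradient descent), here is a minimal sketch; the 1-4-1 architecture, move scheme, tabu tenure, and step size are illustrative assumptions, not the paper's settings.

```python
import math
import random

def forward(weights, x):
    # Tiny 1-4-1 feedforward net with tanh hidden units (assumed architecture).
    # Layout: hidden weights [0:4], hidden biases [4:8],
    # output weights [8:12], output bias [12] -> 13 parameters.
    h = [math.tanh(weights[i] * x + weights[4 + i]) for i in range(4)]
    return sum(weights[8 + i] * h[i] for i in range(4)) + weights[12]

def mse(weights, data):
    return sum((forward(weights, x) - y) ** 2 for x, y in data) / len(data)

def tabu_search(data, n_params=13, iters=300, n_neighbors=20,
                tabu_tenure=15, step=0.5, seed=0):
    rng = random.Random(seed)
    current = [rng.uniform(-1, 1) for _ in range(n_params)]
    best, best_err = current[:], mse(current, data)
    tabu = {}  # parameter index -> iteration until which perturbing it is tabu
    for it in range(iters):
        candidates = []
        for _ in range(n_neighbors):
            idx = rng.randrange(n_params)
            neighbor = current[:]
            neighbor[idx] += rng.gauss(0.0, step)
            err = mse(neighbor, data)
            # Aspiration criterion: a tabu move is allowed only if it
            # improves on the best solution found so far.
            if tabu.get(idx, -1) >= it and err >= best_err:
                continue
            candidates.append((err, idx, neighbor))
        if not candidates:
            continue  # every sampled move was tabu this iteration
        err, idx, current = min(candidates, key=lambda c: c[0])
        tabu[idx] = it + tabu_tenure  # forbid revisiting this move for a while
        if err < best_err:
            best, best_err = current[:], err
    return best, best_err
```

Unlike backpropagation, this accepts the best non-tabu neighbor even when it worsens the current error, which is what lets the search escape local minima; the tabu list prevents immediate cycling back.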
