Article ID: | iaor20023442 |
Country: | United States |
Volume: | 111 |
Issue: | 2 |
Start Page Number: | 359 |
End Page Number: | 379 |
Publication Date: | Nov 2001 |
Journal: | Journal of Optimization Theory and Applications |
Authors: | Liu W., Dai Y.H. |
Keywords: | game theory, search |
In the present work, we explore a general framework for the design of new minimization algorithms with desirable characteristics, namely, supervisor–searcher co-operation. We propose a class of algorithms within this framework and examine a gradient algorithm in the class. Global convergence is established for the deterministic (noise-free) case, and the convergence rate is studied. Both theoretical analysis and numerical tests show that the algorithm is efficient in the deterministic setting. Furthermore, the absence of a line search procedure appears to strengthen the robustness of the algorithm, allowing it to effectively tackle test problems with strong stochastic noise. The numerical results for both deterministic and stochastic test problems illustrate the appealing attributes of the algorithm.
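To illustrate the point about line searches and noise, the following sketch contrasts a plain gradient iteration with a prescribed diminishing stepsize (no line search) on an objective whose gradients are corrupted by noise. This is purely illustrative and is not the supervisor–searcher algorithm of the paper; the stepsize rule, objective, and noise model are assumptions chosen only to show why a line-search-free iteration can tolerate noisy gradient evaluations, where an exact line search on noisy function values would be unreliable.

```python
import numpy as np

def noisy_gradient_descent(grad, x0, steps=2000, a=1.0, noise=0.1, seed=0):
    """Gradient iteration with diminishing stepsize a/k and NO line search.

    `grad` returns the exact gradient; zero-mean Gaussian noise is added to
    each evaluation to mimic a stochastic oracle. (Illustrative only; not the
    algorithm of Liu and Dai.)
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        g = grad(x) + noise * rng.standard_normal(x.shape)  # noisy gradient
        x = x - (a / k) * g  # prescribed step a/k, no line search needed
    return x

# Minimize f(x) = 0.5 * ||x||^2, whose gradient is x and minimizer is 0.
x_final = noisy_gradient_descent(lambda x: x, x0=[5.0, -3.0])
```

Because the stepsizes shrink like 1/k, the noise contributions are averaged out over the iterations and the iterate settles near the minimizer, even though no individual gradient evaluation is exact.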