Pages: 67–77
Publication Date: Jan 2017
Journal: RAIRO - Operations Research
Authors: Dong Xiao Liang, Li Wei Jun, He Yu Bo
Keywords: heuristics, programming: nonlinear, search
The descent condition is a crucial factor in establishing the global convergence of nonlinear conjugate gradient methods. In this paper, we propose some modified Yabe–Takano conjugate gradient methods, in which the corresponding search directions always satisfy the sufficient descent property independently of the convexity of the objective function. Unlike existing methods, a new update strategy for constructing the search direction is proposed to establish the global convergence of the presented methods for general nonconvex objective functions. Numerical results illustrate that our methods efficiently solve the test problems and are therefore promising.
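To illustrate the idea of enforcing a sufficient descent condition in a nonlinear conjugate gradient iteration, here is a minimal generic sketch. It is not the authors' Yabe–Takano variant: it uses the classical Fletcher–Reeves parameter and a simple restart-to-steepest-descent safeguard whenever the trial direction fails the sufficient descent test g^T d ≤ −c‖g‖², with an Armijo backtracking line search; the function names, the constant c, and the test problem are all illustrative assumptions.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule, tol=1e-6, max_iter=2000, c=1e-4):
    """Generic nonlinear CG with a sufficient-descent safeguard.

    beta_rule(g_new, g_old, d) returns the CG parameter beta_k. If the
    candidate direction violates the sufficient descent condition
    g^T d <= -c * ||g||^2, the method restarts with steepest descent.
    (Illustrative sketch only; not the paper's update strategy.)
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = beta_rule(g_new, g, d)
        d_new = -g_new + beta * d
        # Sufficient descent safeguard: restart if the test fails
        if g_new @ d_new > -c * (g_new @ g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# Illustrative run: Fletcher-Reeves beta on a 2-D convex quadratic
fr = lambda gn, go, d: (gn @ gn) / (go @ go)
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = nonlinear_cg(f, grad, np.array([3.0, -2.0]), fr)
```

The safeguard here is the simplest way to guarantee the descent property for any beta rule; the paper's contribution is a construction of the search direction that satisfies sufficient descent automatically, without such a restart.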