Article ID: iaor20061417
Country: Japan
Volume: 48
Issue: 4
Start Page Number: 284
End Page Number: 296
Publication Date: Dec 2005
Journal: Journal of the Operations Research Society of Japan
Authors: Yabe Hiroshi, Sakaiwa Naoki
Keywords: Conjugate gradient method, Global convergence
Conjugate gradient methods are widely used for large-scale unconstrained optimization problems. Most conjugate gradient methods do not always generate a descent search direction, so the descent condition is usually assumed in analyses and implementations. Dai and Yuan proposed a conjugate gradient method that generates a descent search direction at every iteration and converges globally to the solution if the Wolfe conditions are satisfied within the line search strategy. In this paper, we present a new conjugate gradient method based on the study of Dai and Yuan, and show that our method always produces a descent search direction and converges globally if the Wolfe conditions are satisfied. Moreover, our method incorporates second-order curvature information with higher precision by using the modified secant condition proposed by Zhang et al.
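For illustration, below is a minimal Python sketch of the Dai-Yuan conjugate gradient iteration with a Wolfe line search, i.e. the baseline method the abstract builds on; it is not the new method proposed in the paper, whose beta formula (based on the modified secant condition) is not given in this record. The function name `dai_yuan_cg` is this sketch's own; `scipy.optimize.line_search`, which enforces the strong Wolfe conditions, stands in for the paper's line search strategy.

```python
import numpy as np
from scipy.optimize import line_search

def dai_yuan_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Dai-Yuan conjugate gradient method with a Wolfe line search.

    Uses the Dai-Yuan formula
        beta_k = ||g_{k+1}||^2 / (d_k^T y_k),  y_k = g_{k+1} - g_k,
    which, under the Wolfe conditions, yields a descent direction
    at every iteration.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step size satisfying the (strong) Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            # Line search failed: restart along steepest descent
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:
                break
        x = x + alpha * d
        g_new = grad(x)
        y = g_new - g
        # Wolfe curvature condition guarantees d @ y > 0 here
        beta = (g_new @ g_new) / (d @ y)
        d = -g_new + beta * d
        g = g_new
    return x
```

As a quick check under these assumptions, `dai_yuan_cg(rosen, rosen_der, np.array([-1.2, 1.0]))`, with `rosen` and `rosen_der` imported from `scipy.optimize`, should converge to the Rosenbrock minimizer at (1, 1).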