A new general form of conjugate gradient methods with guaranteed descent and strong global convergence properties

Article ID: iaor20123762
Volume: 60
Issue: 1
Start Page Number: 135
End Page Number: 152
Publication Date: May 2012
Journal: Numerical Algorithms
Authors:
Keywords: gradient methods
Abstract:

Although the study of the global convergence of the Polak–Ribière–Polyak (PRP), Hestenes–Stiefel (HS) and Liu–Storey (LS) conjugate gradient methods has made great progress, the convergence of these algorithms for general nonlinear functions remains unsettled, let alone under weak conditions on the objective function and weak line search rules. It is also interesting to ask whether there exists a general method that converges under the standard Armijo line search for general nonconvex functions, since very few such results have been achieved. In this paper, we present a new general form of conjugate gradient methods with attractive theoretical properties. For any formula β_k ≥ 0 and under weak conditions, the proposed method satisfies the sufficient descent condition independently of the line search used and of the convexity of the objective function, and its global convergence can be established under the standard Wolfe line search or even under the standard Armijo line search. Based on this new method, convergence results for the PRP, HS, LS, Dai–Yuan-type (DY) and Conjugate–Descent-type (CD) methods are established. Preliminary numerical results show the efficiency of the proposed methods.
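To make the ingredients named in the abstract concrete, the following is a minimal, hedged sketch of a standard nonlinear conjugate gradient loop. It is NOT the paper's proposed general form: it simply combines a nonnegative β_k (here the common PRP+ truncation, max(0, ·), one instance of the β_k ≥ 0 assumption), a descent-direction safeguard, and a standard Armijo backtracking line search, which are the components the abstract discusses. All function and parameter names are illustrative choices.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=200, tol=1e-6):
    """Illustrative nonlinear CG with a nonnegative (PRP+) beta and
    Armijo backtracking. A sketch of the general scheme only, not the
    paper's proposed method."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                     # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gd = g @ d
        if gd >= 0:            # safeguard: restart if d is not a descent direction
            d = -g
            gd = -(g @ g)
        # Standard Armijo backtracking: f(x + t d) <= f(x) + c * t * (g . d)
        t, c = 1.0, 1e-4
        while f(x + t * d) > f(x) + c * t * gd:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP+ beta: the max(0, .) truncation enforces beta_k >= 0
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage example on a simple convex quadratic, f(x) = ||x - 1||^2
f = lambda x: float(np.sum((x - 1.0) ** 2))
grad = lambda x: 2.0 * (x - 1.0)
sol = nonlinear_cg(f, grad, np.zeros(3))
```

The paper's contribution, per the abstract, is a direction formula that satisfies the sufficient descent condition for any β_k ≥ 0 without a safeguard like the restart above; the sketch uses the explicit restart only to keep this generic version well-defined.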
