Global convergence of conjugate gradient methods without line search

Article ID: iaor20022504
Country: Netherlands
Volume: 103
Issue: 1
Start Page Number: 161
End Page Number: 173
Publication Date: Mar 2001
Journal: Annals of Operations Research
Authors:
Abstract:

Global convergence results are derived for well-known conjugate gradient methods in which the line search step is replaced by a step whose length is determined by a formula. The results cover the following cases: (1) the Fletcher–Reeves method, the Hestenes–Stiefel method, and the Dai–Yuan method applied to a strongly convex LC¹ objective function; (2) the Polak–Ribière method and the Conjugate Descent method applied to a general (not necessarily convex) LC¹ objective function.
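To illustrate the idea summarized above, the sketch below implements a Fletcher–Reeves conjugate gradient iteration in which the usual line search is replaced by a closed-form step length. The particular formula used here, alpha_k = delta * (-g_k' d_k) / (L * ||d_k||^2), together with the parameters delta and the Lipschitz-type constant L, is an illustrative assumption and is not claimed to be the exact step-length formula analyzed in the paper.

import numpy as np

def fr_cg_no_linesearch(grad, x0, delta=0.5, L=10.0, tol=1e-8, max_iter=1000):
    """Fletcher-Reeves conjugate gradient iteration with a formula-based
    step length instead of a line search (illustrative sketch only).

    grad  : callable returning the gradient of the objective at x
    delta : scaling parameter in (0, 1) for the step-length formula (assumed)
    L     : user-supplied Lipschitz-type constant for the gradient (assumed)
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Formula-based step length (hypothetical choice): proportional to the
        # directional derivative -g'd, scaled by L * ||d||^2, with no line search.
        alpha = delta * (-(g @ d)) / (L * (d @ d))
        x = x + alpha * d
        g_new = grad(x)
        # Fletcher-Reeves choice of the conjugacy parameter beta.
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d
        g = g_new
    return x

# Example: minimize the strongly convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose gradient is A x - b (an LC^1, strongly convex objective).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = fr_cg_no_linesearch(lambda x: A @ x - b, x0=np.zeros(2), L=4.0)

Because the step length comes from a formula rather than a search, each iteration costs a single gradient evaluation; the price is that convergence relies on parameters such as L being chosen consistently with the smoothness of the objective, which is the kind of condition the paper's analysis addresses.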
