A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization

Article ID: iaor201465
Volume: 11
Issue: 4
Start Page Number: 361
End Page Number: 374
Publication Date: Dec 2013
Journal: 4OR
Authors:
Keywords: gradient search
Abstract:

In order to propose a scaled conjugate gradient method, the memoryless BFGS preconditioned conjugate gradient method suggested by Shanno and the spectral conjugate gradient method suggested by Birgin and Martínez are hybridized following Andrei's approach. Since the proposed method is based on a revised form of a modified secant equation suggested by Zhang et al., one of its interesting features is that it uses the available function values in addition to the gradient values. It is shown that, for uniformly convex objective functions, the search directions of the method satisfy the sufficient descent condition, which leads to global convergence. Numerical comparisons between implementations of the method and an efficient scaled conjugate gradient method proposed by Andrei, made on a set of unconstrained optimization test problems from the CUTEr collection, show the efficiency of the proposed modified scaled conjugate gradient method in the sense of the performance profile introduced by Dolan and Moré.
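To illustrate the ingredients the abstract combines, the following sketch shows one common form of a scaled memoryless BFGS preconditioned CG direction (a Shanno/Andrei-style update with the Birgin–Martínez spectral scaling) together with a Zhang-et-al.-style modified secant vector that incorporates function values. The exact formulas of the paper's method are not given here, so the update below is an illustrative assumption, not the authors' algorithm; the function names and the choice `u = s` in the modified secant vector are hypothetical.

```python
import numpy as np

def modified_secant_vector(s, y, f_k, f_k1, g_k, g_k1):
    """Zhang-et-al.-style modified secant vector (illustrative form).

    Replaces y_k by z_k = y_k + (theta_bar / s's) s with
    theta_bar = 6 (f_k - f_{k+1}) + 3 (g_k + g_{k+1})' s,
    so that function values enter the update; on a quadratic
    objective theta_bar vanishes and z_k reduces to y_k.
    """
    theta_bar = 6.0 * (f_k - f_k1) + 3.0 * (g_k + g_k1) @ s
    return y + (theta_bar / (s @ s)) * s

def scaled_memoryless_bfgs_direction(g, s, z):
    """Scaled memoryless BFGS preconditioned CG direction (assumed form).

    Uses the spectral scaling theta = s's / s'z (Birgin-Martinez) inside
    a memoryless BFGS update applied to the current gradient g.
    """
    sz = s @ z                      # curvature term; positive for convex f
    theta = (s @ s) / sz            # spectral scaling parameter
    return (-theta * g
            + theta * ((g @ s) / sz) * z
            - ((1.0 + theta * (z @ z) / sz) * (g @ s) / sz
               - theta * (g @ z) / sz) * s)

# Tiny usage example on a convex quadratic f(x) = 0.5 x'Ax.
A = np.diag([1.0, 4.0])
x_k, x_k1 = np.array([1.0, 1.0]), np.array([0.5, 0.2])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

s = x_k1 - x_k
y = grad(x_k1) - grad(x_k)
z = modified_secant_vector(s, y, f(x_k), f(x_k1), grad(x_k), grad(x_k1))
d = scaled_memoryless_bfgs_direction(grad(x_k1), s, z)
print(grad(x_k1) @ d)  # negative: d is a descent direction
```

On a strictly convex quadratic the curvature condition `s @ z > 0` holds automatically; in a full line-search implementation it is typically enforced by a Wolfe condition, which is one reason the sufficient descent property mentioned in the abstract is established for uniformly convex objectives.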
