A feasible SQP‐GS algorithm for nonconvex, nonsmooth constrained optimization

Article ID: iaor2014215
Volume: 65
Issue: 1
Start Page Number: 1
End Page Number: 22
Publication Date: Jan 2014
Journal: Numerical Algorithms
Authors: , , ,
Keywords: global optimization, nonsmooth optimization, gradient search
Abstract:

The gradient sampling (GS) algorithm for minimizing a nonconvex, nonsmooth function was proposed by Burke et al. (2005); its most interesting feature is the use of randomly sampled gradients instead of subgradients. In this paper, combining the GS technique with the sequential quadratic programming (SQP) method, we present a feasible SQP‐GS algorithm that extends the GS algorithm to nonconvex, nonsmooth constrained optimization. The proposed algorithm generates a sequence of feasible iterates and guarantees that the objective function decreases monotonically. Global convergence is proved in the sense that, with probability one, every cluster point of the iterative sequence is stationary for the improvement function. Finally, preliminary numerical results show that the proposed algorithm is effective.
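The core GS idea described in the abstract, replacing subgradients with randomly sampled gradients, can be illustrated with a minimal unconstrained sketch. This is not the paper's SQP‐GS method; it is a hedged toy implementation of one gradient-sampling step on an assumed nonsmooth test objective f(x) = |x1| + 2|x2|, where the search direction is the (approximate) minimum-norm element of the convex hull of gradients sampled in a small ball around the iterate, followed by a backtracking line search to enforce descent:

```python
import numpy as np

def f(x):
    # assumed nonsmooth test objective (not from the paper): f(x) = |x1| + 2|x2|
    return abs(x[0]) + 2.0 * abs(x[1])

def grad_f(x):
    # gradient of f where it is differentiable (almost everywhere)
    return np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

def project_simplex(v):
    # Euclidean projection of v onto the probability simplex
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1.0)
    return np.maximum(v + theta, 0.0)

def min_norm_in_hull(G, iters=500, lr=0.1):
    # approximate the minimum-norm element of conv{rows of G} by
    # projected gradient descent on the convex-combination weights
    m = G.shape[0]
    w = np.full(m, 1.0 / m)
    Q = G @ G.T
    for _ in range(iters):
        w = project_simplex(w - lr * (Q @ w))
    return G.T @ w

def gs_step(x, eps=0.1, m=10, rng=None):
    # one gradient-sampling step: sample gradients in an eps-ball around x,
    # use the min-norm hull element as search direction, backtrack for descent
    rng = np.random.default_rng(0) if rng is None else rng
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, len(x)))
    G = np.vstack([grad_f(x)] + [grad_f(p) for p in pts])
    g = min_norm_in_hull(G)
    t = 1.0
    while f(x - t * g) > f(x) - 1e-4 * t * (g @ g) and t > 1e-12:
        t *= 0.5
    return x - t * g
```

Near a kink of f the sampled gradients differ, so the min-norm hull element shrinks toward zero, which is what makes it a workable stationarity measure for nonsmooth objectives; the paper's contribution is embedding this sampling scheme inside a feasible SQP framework so the same idea applies under constraints.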
