Multiplicative interior gradient methods for minimization over the nonnegative orthant

Article ID: iaor19961317
Country: United States
Volume: 34
Issue: 1
Start Page Number: 389
End Page Number: 406
Publication Date: Jan 1996
Journal: SIAM Journal on Control and Optimization
Authors: , ,
Keywords: programming: nonlinear
Abstract:

The authors introduce a new class of multiplicative iterative methods for solving minimization problems over the nonnegative orthant. The algorithm is a natural extension of gradient methods for unconstrained minimization to the case of nonnegativity constraints, with the special feature that it generates a sequence of iterates that remain in the interior of the nonnegative orthant. The authors prove that the algorithm, combined with an appropriate line search, is weakly convergent to a saddle point of the minimization problem when the minimand is a differentiable function with bounded level sets. If the function is convex, weak convergence to an optimal solution is obtained. Moreover, by using an appropriate regularized line search, the authors prove that the level-set boundedness hypothesis can be removed, and full convergence of the iterates to an optimal solution is established in the convex case.
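To illustrate the general idea described in the abstract, the following Python sketch implements one plausible instance of a multiplicative interior gradient step: each coordinate is rescaled by exp(-t * gradient), so a strictly positive starting point yields iterates that stay in the interior of the nonnegative orthant. The exponentiated-gradient update, the Armijo-style backtracking rule, and all function and parameter names here are illustrative assumptions, not the exact method or line search analyzed in the paper.

import numpy as np

def multiplicative_interior_gradient(f, grad, x0, max_iter=500, t0=1.0,
                                     beta=0.5, sigma=1e-4, tol=1e-8):
    # Hypothetical sketch: multiplicative (exponentiated-gradient) update
    # with a backtracking line search; a stand-in for the "appropriate
    # line search" mentioned in the abstract, not the paper's exact rule.
    x = np.asarray(x0, dtype=float)
    assert np.all(x > 0), "start strictly inside the nonnegative orthant"
    for _ in range(max_iter):
        g = grad(x)
        fx = f(x)
        t = t0
        # Shrink the stepsize until a sufficient-decrease test holds;
        # the directional derivative of t -> f(x * exp(-t g)) at 0 is -<g, x*g>.
        while True:
            x_new = x * np.exp(-t * g)          # multiplicative update keeps x_new > 0
            if f(x_new) <= fx - sigma * t * np.dot(g, x * g) or t < 1e-16:
                break
            t *= beta
        if np.linalg.norm(x_new - x) <= tol * (1.0 + np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Toy usage (assumed example): minimize 0.5*||Ax - b||^2 over x >= 0.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)
print(multiplicative_interior_gradient(f, grad, x0=np.ones(2)))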
