Convergence and efficiency of subgradient methods for quasiconvex minimization

Article ID: iaor2002935
Country: Germany
Volume: 90
Issue: 1
Start Page Number: 1
End Page Number: 25
Publication Date: Jan 2001
Journal: Mathematical Programming
Authors:
Abstract:

We study a general subgradient projection method for minimizing a quasiconvex objective subject to a convex set constraint in a Hilbert space. Our setting is very general: the objective is only upper semicontinuous on its domain, which need not be open, and various subdifferentials may be used. We extend previous results by proving convergence in objective values and to the generalized solution set for classical stepsizes tₖ → 0, Σₖ tₖ = ∞, and weak or strong convergence of the iterates to a solution for {tₖ} ∈ ℓ² ∖ ℓ¹ under mild regularity conditions. For bounded constraint sets and suitable stepsizes, the method finds ε-solutions with an efficiency estimate of O(ε⁻²), thus being optimal in the sense of Nemirovskii.
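
The following is a minimal finite-dimensional sketch of a projected subgradient iteration of the kind described above, not the paper's exact algorithm: it assumes a Euclidean projection onto the constraint set, uses a normalized step direction as a stand-in for an element of a quasi-subdifferential, and employs the classical diminishing stepsizes tₖ = 1/k (so tₖ → 0 and Σₖ tₖ = ∞). The function names and the ball-constrained example are illustrative only.

```python
import numpy as np

def projected_subgradient(f, subgrad, project, x0, num_iters=1000):
    """Projected subgradient iteration with diminishing stepsizes
    t_k = 1/k (t_k -> 0, sum t_k = infinity); tracks the best value found,
    since individual subgradient steps need not decrease the objective."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, num_iters + 1):
        g = subgrad(x)
        g_norm = np.linalg.norm(g)
        if g_norm == 0:                     # stationary point reached
            break
        t_k = 1.0 / k                       # classical diminishing stepsize
        x = project(x - t_k * g / g_norm)   # normalized step, then project
        f_x = f(x)
        if f_x < best_f:
            best_x, best_f = x.copy(), f_x
    return best_x, best_f

# Illustrative quasiconvex example: f(x) = sqrt(||x - c||) is a monotone
# transform of a norm, hence quasiconvex; its sublevel sets are balls
# around c, so (x - c) supplies a valid (quasi-)subgradient direction.
c = np.array([3.0, -2.0])
f = lambda x: np.sqrt(np.linalg.norm(x - c))
subgrad = lambda x: x - c
project = lambda x: x / max(1.0, np.linalg.norm(x))   # unit-ball constraint
x_best, f_best = projected_subgradient(f, subgrad, project, np.zeros(2))
print(x_best, f_best)
```

Because the step direction is normalized, only the direction of the returned vector matters, which is why a crude surrogate for the quasi-subdifferential suffices in this toy setting.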
