On the projected subgradient method for nonsmooth convex optimization in a Hilbert space

Article ID: iaor19992583
Country: Netherlands
Volume: 81
Issue: 1
Start Page Number: 23
End Page Number: 35
Publication Date: Mar 1998
Journal: Mathematical Programming
Authors: , ,
Abstract:

We consider the method for constrained convex optimization in a Hilbert space consisting of a step in the direction opposite to an ε_k-subgradient of the objective at the current iterate, followed by an orthogonal projection onto the feasible set. The normalized stepsizes α_k are exogenously given, satisfying Σ_{k=0}^∞ α_k = ∞ and Σ_{k=0}^∞ α_k² < ∞, and ε_k is chosen so that ε_k ≤ μα_k for some μ > 0. We prove that the sequence generated in this way converges weakly to a minimizer if the problem has solutions, and is unbounded otherwise. Among the features of our convergence analysis, we mention that it covers the nonsmooth case, in the sense that we make no assumption of differentiability of f, let alone Lipschitz continuity of its gradient. Also, we prove weak convergence of the whole sequence, rather than just boundedness of the sequence and optimality of its weak accumulation points, thus improving over all previously known convergence results. We also present convergence rate results.
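As a minimal illustration of the iteration described in the abstract, the following sketch runs the projected ε_k-subgradient method on a small finite-dimensional example. The objective f(x) = |x − 3|, the box constraint [0, 1], the stepsize choice α_k = 1/(k+1), and the exact (ε = 0) subgradient oracle are all assumptions made here for concreteness; they are not taken from the paper.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, n_iters=200, mu=0.0):
    """Sketch of the projected epsilon-subgradient method:
        x_{k+1} = P( x_k - alpha_k * g_k / ||g_k|| ),
    where g_k is an eps_k-subgradient with eps_k <= mu * alpha_k, and the
    exogenous stepsizes alpha_k = 1/(k+1) satisfy
    sum alpha_k = infinity and sum alpha_k^2 < infinity."""
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        alpha = 1.0 / (k + 1)
        eps = mu * alpha                  # tolerance allowed for the oracle
        g = np.asarray(subgrad(x, eps), dtype=float)
        norm = np.linalg.norm(g)
        if norm == 0.0:                   # 0 is a subgradient: x is optimal
            return x
        x = project(x - alpha * g / norm)  # normalized step, then project
    return x

# Example (assumed, not from the paper): minimize the nonsmooth
# f(x) = |x - 3| over the box [0, 1]; the minimizer is x = 1.
# sign(x - 3) is an exact subgradient of f away from x = 3.
f_sub = lambda x, eps: np.sign(x - 3.0)
proj_box = lambda x: np.clip(x, 0.0, 1.0)
x_star = projected_subgradient(f_sub, proj_box, x0=[0.2])
# x_star is (numerically) the constrained minimizer 1.0
```

In this one-dimensional example the projection is a simple clamp; in the Hilbert-space setting of the paper it is the orthogonal projection onto the closed convex feasible set, and only the α_k summability conditions and ε_k ≤ μα_k are required of the parameters.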
