We consider the method for constrained convex optimization in a Hilbert space, consisting of a step in the direction opposite to an $\epsilon_k$-subgradient of the objective at the current iterate, followed by an orthogonal projection onto the feasible set. The normalized stepsizes $\alpha_k$ are exogenously given, satisfying $\sum_{k=0}^{\infty} \alpha_k = \infty$ and $\sum_{k=0}^{\infty} \alpha_k^2 < \infty$, and $\epsilon_k$ is chosen so that $\epsilon_k \leq \mu\alpha_k$ for some $\mu > 0$. We prove that the sequence generated in this way converges weakly to a minimizer if the problem has solutions, and is unbounded otherwise. Among the features of our convergence analysis, we mention that it covers the nonsmooth case, in the sense that we make no assumption of differentiability of the objective $f$, let alone of Lipschitz continuity of its gradient. Also, we prove weak convergence of the whole sequence, rather than just boundedness of the sequence and optimality of its weak accumulation points, thus improving over all previously known convergence results. We also present convergence rate results.
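For concreteness, the following is a minimal sketch of a projected $\epsilon_k$-subgradient iteration of the kind described above, restricted to a finite-dimensional instance; it is not the paper's algorithm verbatim. The oracle `subgrad`, the projection `project`, the stepsize rule $\alpha_k = c/(k+1)$, and the normalization by $\max\{1,\|u^k\|\}$ are illustrative assumptions, chosen only so that $\sum_k \alpha_k = \infty$, $\sum_k \alpha_k^2 < \infty$, and $\epsilon_k = \mu\alpha_k$ hold as in the text.

```python
# Illustrative sketch of a projected epsilon-subgradient method (assumptions noted above).
import numpy as np

def projected_eps_subgradient(subgrad, project, x0, c=1.0, mu=0.1, iters=5000):
    """subgrad(x, eps): an eps-subgradient of f at x; project(x): orthogonal projection onto the feasible set."""
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        alpha_k = c / (k + 1)                  # divergent but square-summable stepsizes
        eps_k = mu * alpha_k                   # inexactness tied to the stepsize
        u = subgrad(x, eps_k)                  # eps_k-subgradient of the objective at x
        norm = max(1.0, np.linalg.norm(u))     # normalization rule (an assumption)
        x = project(x - (alpha_k / norm) * u)  # subgradient step, then projection
    return x

# Hypothetical example: minimize f(x) = ||x - b|| over the unit ball (exact subgradients).
b = np.array([2.0, 0.0])
f_sub = lambda x, eps: (x - b) / max(np.linalg.norm(x - b), 1e-12)
proj_ball = lambda x: x / max(1.0, np.linalg.norm(x))
x_approx = projected_eps_subgradient(f_sub, proj_ball, np.zeros(2))  # approaches (1, 0)
```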