Start Page Number: 121
End Page Number: 145
Publication Date: Oct 2016
Journal: Journal of Optimization Theory and Applications
Authors: Pavel Dvurechensky, Alexander Gasnikov
Keywords: stochastic processes, programming: convex
In this paper, we introduce new methods for convex optimization problems with a stochastic inexact oracle. Our first method is an extension of the Intermediate Gradient Method proposed by Devolder, Glineur and Nesterov for problems with a deterministic inexact oracle. Our method can be applied to problems with a composite objective function and both deterministic and stochastic inexactness of the oracle, and it allows using a non-Euclidean setup. We estimate the rate of convergence in terms of the expectation of the non-optimality gap and provide a way to control the probability of large deviations from this rate. We also introduce two modifications of this method for strongly convex problems. For the first modification, we estimate the rate of convergence of the non-optimality gap expectation and, for the second, we bound the probability of large deviations from the rate of convergence expressed in terms of the expectation of the non-optimality gap. All the rates lead to complexity estimates for the proposed methods, which, up to a multiplicative constant, coincide with the lower complexity bound for the considered class of convex composite optimization problems with a stochastic inexact oracle.
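To make the setting concrete, the following is a minimal illustrative sketch (not the authors' Intermediate Gradient Method) of a generic stochastic proximal-gradient iteration for a composite objective f(x) + h(x), where only a noisy (stochastic inexact) gradient oracle for the smooth part f is available and the simple convex part h is handled exactly through its proximal operator. The toy problem, step size, and averaging scheme are assumptions chosen for illustration.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_composite_gradient(grad_oracle, prox_h, x0, step, n_iters, rng):
    """Averaged stochastic proximal-gradient iterations.

    grad_oracle(x, rng) returns a noisy estimate of grad f(x); averaging the
    iterates damps the oracle noise, a common device in methods of this type.
    """
    x = x0.copy()
    avg = np.zeros_like(x0)
    for _ in range(n_iters):
        g = grad_oracle(x, rng)          # stochastic inexact oracle call
        x = prox_h(x - step * g, step)   # forward step + prox of h
        avg += x
    return avg / n_iters

# Toy composite problem: least squares (smooth) + l1 penalty (simple convex).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
lam = 0.1

def obj(x):
    return 0.5 * np.sum((A @ x - b) ** 2) / len(b) + lam * np.sum(np.abs(x))

def noisy_grad(x, rng):
    # exact gradient of the smooth part plus additive noise,
    # mimicking a stochastic inexact oracle
    return A.T @ (A @ x - b) / len(b) + 0.01 * rng.standard_normal(x.shape)

x0 = np.zeros(10)
x_hat = stochastic_composite_gradient(
    noisy_grad, lambda v, t: prox_l1(v, lam * t),
    x0, step=0.1, n_iters=2000, rng=rng)
print(obj(x_hat) < obj(x0))  # the averaged iterate improves on the start point
```

The Euclidean prox step stands in for the non-Euclidean (mirror-descent-style) setup mentioned in the abstract; replacing the squared Euclidean distance with another Bregman divergence yields the non-Euclidean variants.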