Optimal step sizes in semi-stochastic approximation procedures. I

Article ID: iaor19901127
Country: Germany
Volume: 21
Start Page Number: 123
End Page Number: 153
Publication Date: Sep 1990
Journal: Optimization
Authors: ,
Abstract:

Stochastic approximation procedures, e.g. stochastic gradient methods, can be accelerated considerably by using deterministic descent directions or more exact gradient estimates at certain iteration points. Given several methods for finding (feasible) deterministic directions of decrease or improved search directions, the remaining problem is to choose an appropriate sequence of step sizes. Using the standard assumptions of (pure) stochastic approximation methods and developing corresponding conditions (‘descent direction qualifications’) for the deterministic descent directions, recurrence relations for upper and lower bounds on the estimation errors are derived first. The geometrical meaning of the descent direction qualifications is discussed in detail. By then minimizing the mean square argument errors (or, respectively, the mean criterial errors) arising during the semi-stochastic approximation process with respect to the step sizes, recursions for the resulting optimal error bounds are obtained. Besides deriving a basic formula for the optimal step sizes, the authors show that these step sizes are also globally optimal, and that the optimally controlled process is strictly monotonically decreasing in the mean.
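
The abstract does not reproduce the step-size formulas themselves, but the general shape of the semi-stochastic iteration it describes can be sketched as follows. This is an illustrative sketch only: the test objective, the noise model, the switching rule `exact_every`, and the placeholder 1/(n+1) step sizes are assumptions for illustration and are not the optimal step sizes derived in the paper.

```python
import numpy as np

# Sketch of a semi-stochastic approximation step: at most iterations a
# noisy gradient estimate is used, but at selected iterations an exact
# deterministic descent direction replaces it.  The 1/(n+1) schedule
# below is a generic placeholder, not the paper's optimal step sizes.

def grad(x):
    # assumed test objective f(x) = 0.5 * ||x||^2, so grad f(x) = x
    return x

def semi_stochastic_descent(x0, n_iters=1000, noise_std=1.0,
                            exact_every=10, seed=None):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for n in range(n_iters):
        if n % exact_every == 0:
            # deterministic descent direction (exact gradient here)
            d = grad(x)
        else:
            # stochastic gradient estimate: exact gradient plus noise
            d = grad(x) + noise_std * rng.standard_normal(x.shape)
        rho = 1.0 / (n + 1)      # placeholder step-size schedule
        x = x - rho * d          # X_{n+1} = X_n - rho_n * d_n
    return x

print(semi_stochastic_descent(np.ones(3)))
```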
