Convergence of One-Step Projected Gradient Methods for Variational Inequalities

Article ID: iaor20163680
Volume: 171
Issue: 1
Start Page Number: 146
End Page Number: 168
Publication Date: Oct 2016
Journal: Journal of Optimization Theory and Applications
Authors: ,
Keywords: programming: mathematical, heuristics
Abstract:

In this paper, we revisit the numerical approach to some classical variational inequalities, with a monotone and Lipschitz continuous mapping A, by means of a projected reflected gradient-type method. A main feature of the method is that it formally requires only one projection onto the feasible set and one evaluation of the involved mapping per iteration. In contrast to earlier work, we establish convergence of the method in a more general setting that allows varying step sizes without requiring any additional projections. A linear convergence rate is obtained when A is assumed to be strongly monotone. Preliminary numerical experiments are also reported.
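The abstract does not spell out the iteration, so the following is only a minimal sketch of the classical projected reflected gradient step, x_{k+1} = P_C(x_k - λ A(2 x_k - x_{k-1})), which matches the "one projection and one evaluation of A per iteration" feature described above. The fixed step size, the test problem, and all names below are illustrative assumptions, not the paper's exact varying step-size scheme.

```python
import numpy as np

def projected_reflected_gradient(A, proj_C, x0, lam, max_iter=1000, tol=1e-10):
    # Classical projected reflected gradient iteration:
    #   x_{k+1} = P_C( x_k - lam * A(2*x_k - x_{k-1}) )
    # One projection onto C and one evaluation of A per iteration.
    x_prev = np.asarray(x0, dtype=float)
    x = proj_C(x_prev - lam * A(x_prev))   # starter step (one extra projection/evaluation)
    for _ in range(max_iter):
        y = 2.0 * x - x_prev               # reflected point 2*x_k - x_{k-1}
        x_next = proj_C(x - lam * A(y))    # the single projection of this iteration
        if np.linalg.norm(x_next - x) <= tol:
            return x_next
        x_prev, x = x, x_next
    return x

# Illustrative test problem (assumed, not taken from the paper):
# affine mapping A(x) = M x + q with positive definite M (hence strongly monotone),
# feasible set C = nonnegative orthant, so the projection is componentwise clipping.
M = np.array([[4.0, 1.0], [-1.0, 4.0]])
q = np.array([-1.0, -2.0])
A = lambda x: M @ x + q
proj_C = lambda x: np.maximum(x, 0.0)

L = np.linalg.norm(M, 2)                   # Lipschitz constant of the affine mapping
x_star = projected_reflected_gradient(A, proj_C, np.zeros(2), lam=0.3 / L)
print(x_star)
```

In this sketch the step size 0.3/L is a conservative choice below the bound (sqrt(2) - 1)/L commonly used for fixed-step projected reflected gradient methods; the paper's contribution is precisely to allow varying step sizes without extra projections.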
