Gradient algorithms for quadratic optimization with fast convergence rates

Article ID: iaor201111690
Volume: 50
Issue: 3
Start Page Number: 597
End Page Number: 617
Publication Date: Dec 2011
Journal: Computational Optimization and Applications
Authors:
Keywords: optimization
Abstract:

We propose a family of gradient algorithms for minimizing a quadratic function f(x) = (Ax, x)/2 − (x, y) in ℝ^d or a Hilbert space, with simple rules for choosing the step-size at each iteration. We show that when the step-sizes are generated by a dynamical system whose ergodic distribution has the arcsine density on a subinterval of the spectrum of A, the asymptotic rate of convergence of the algorithm can approach the (tight) bound on the rate of convergence of a conjugate gradient algorithm stopped before d iterations, where d ≤ ∞ is the space dimension.
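To illustrate the idea described in the abstract, the sketch below runs a gradient method on a quadratic with step-sizes 1/λ_k, where the λ_k are produced by a chaotic dynamical system whose ergodic density is the arcsine law on an interval [a, b] containing the spectrum of A. The logistic map is used here purely as a convenient generator with an arcsine invariant density; the paper's specific step-size rules, and the choice of [a, b] as the full spectral range, are assumptions for this toy example, not the authors' exact construction.

```python
import numpy as np

def arcsine_steps(a, b, n, t0=0.3):
    """Generate n values whose empirical distribution approximates the
    arcsine density on [a, b].

    Uses the logistic map t -> 4t(1-t), whose invariant density on (0, 1)
    is the arcsine law 1/(pi*sqrt(t(1-t))); values are then mapped
    affinely onto [a, b].  (Illustrative choice, not necessarily the
    dynamical system used in the paper.)
    """
    t = t0
    out = np.empty(n)
    for k in range(n):
        t = 4.0 * t * (1.0 - t)
        out[k] = a + (b - a) * t
    return out

def gradient_quadratic(A, y, x0, n_iter, a, b):
    """Minimize f(x) = (Ax, x)/2 - (x, y) by gradient steps of size
    1/lambda_k, with lambda_k arcsine-distributed on [a, b]."""
    x = x0.astype(float).copy()
    for lam in arcsine_steps(a, b, n_iter):
        g = A @ x - y      # gradient of f at x
        x -= g / lam       # step-size 1/lambda_k
    return x

# Toy problem: symmetric positive-definite A with spectrum in [1, 100]
rng = np.random.default_rng(0)
d = 50
eigs = np.linspace(1.0, 100.0, d)
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
A = Q @ np.diag(eigs) @ Q.T
y = rng.standard_normal(d)
x_star = np.linalg.solve(A, y)  # exact minimizer for comparison

x = gradient_quadratic(A, y, np.zeros(d), n_iter=400, a=1.0, b=100.0)
err = np.linalg.norm(x - x_star) / np.linalg.norm(x_star)
print(f"relative error after 400 iterations: {err:.2e}")
```

Individual steps with small λ_k can transiently amplify the error, but because the arcsine density is the equilibrium measure of the interval, the geometric mean of the contraction factors is the same for every eigencomponent, which is what drives the fast asymptotic rate discussed in the paper.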
