Computing gradients in large-scale optimization using automatic differentiation

Article ID: iaor1999294
Country: United States
Volume: 9
Issue: 2
Start Page Number: 185
End Page Number: 194
Publication Date: Mar 1997
Journal: INFORMS Journal On Computing
Authors: , , ,
Keywords: gradient methods, optimization
Abstract:

The accurate and efficient computation of gradients for partially separable functions is central to the solution of large-scale optimization problems, because such functions are ubiquitous in large-scale applications. We describe two approaches for computing gradients of partially separable functions via automatic differentiation. In our experiments we employ the ADIFOR (automatic differentiation of Fortran) tool and the SparsLinC (sparse linear combination) library. We use applications from the MINPACK-2 test problem collection to compare the numerical reliability and computational efficiency of these approaches with hand-coded derivatives and with approximations based on differences of function values. Our conclusion is that automatic differentiation is the method of choice, providing code that computes the gradient efficiently without tedious hand-coding.
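
The partially separable structure discussed in the abstract can be illustrated outside the paper's Fortran/ADIFOR setting. The sketch below is an assumption-laden illustration, not the authors' code: it uses JAX on a chained Rosenbrock-style test function in which each element function depends on only two consecutive variables, so reverse-mode automatic differentiation produces the exact gradient, which is then checked against the kind of forward-difference approximation the paper compares against. The names f, grad_f, and x0 are illustrative only.

```python
import jax
import jax.numpy as jnp

# Hypothetical partially separable test function (chained Rosenbrock style):
# f(x) = sum_i [ 100*(x_{i+1} - x_i^2)^2 + (1 - x_i)^2 ],
# where each element function depends on only two consecutive variables.
def f(x):
    return jnp.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

grad_f = jax.grad(f)                     # reverse-mode AD for the full gradient
x0 = jnp.linspace(-1.0, 1.0, 1000)       # illustrative starting point
g = grad_f(x0)                           # exact gradient (to floating-point accuracy)

# Forward-difference approximation of the first gradient component,
# the kind of difference approximation AD is measured against.
h = 1e-6
e0 = jnp.zeros_like(x0).at[0].set(1.0)
fd0 = (f(x0 + h * e0) - f(x0)) / h
print(float(g[0]), float(fd0))
```

In this sketch the AD gradient and the difference approximation agree to roughly the square root of machine precision, which is the accuracy limitation of differencing that motivates the paper's preference for automatic differentiation.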
