Article ID: iaor20163637
Volume: 66
Issue: 3
Start Page Number: 331
End Page Number: 382
Publication Date: Nov 2016
Journal: Journal of Global Optimization
Authors: Beyhaghi Pooriya, Cavaglieri Daniele, Bewley Thomas
Keywords: heuristics, matrices
A new derivative‐free optimization algorithm is introduced for nonconvex functions within a feasible domain bounded by linear constraints. Global convergence is guaranteed for twice-differentiable functions with bounded Hessian, and the algorithm is found to be remarkably efficient even for many functions that are not differentiable. Like other Response Surface Methods, at each optimization step the algorithm minimizes a metric combining an interpolation of existing function evaluations and a model of the uncertainty of that interpolation. By adjusting the respective weighting of these two terms, the algorithm strikes a tunable balance between global exploration and local refinement; a rule to adjust this balance automatically is also presented. Unlike other methods, any well-behaved interpolation strategy may be used. The uncertainty model is built upon the framework of a Delaunay triangulation of the existing datapoints in parameter space. Within each simplex of this triangulation, a quadratic function that goes to zero at each datapoint is formed; the union of these quadratics forms the desired uncertainty model. Care is taken to ensure that function evaluations are performed at points that are
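The piecewise-quadratic uncertainty model described above can be sketched concretely. As a simple illustration (not necessarily the paper's exact construction), one quadratic that vanishes at every vertex of a simplex is e(x) = r² − ‖x − z‖², where z and r are the circumcenter and circumradius of that simplex; since all vertices lie on the circumsphere, e is zero at each datapoint and positive inside the simplex. A minimal Python sketch using `scipy.spatial.Delaunay` (the helper names `circumsphere` and `uncertainty` are illustrative):

```python
import numpy as np
from scipy.spatial import Delaunay

def circumsphere(verts):
    """Circumcenter z and squared circumradius r2 of a d-simplex in R^d.

    Solves the linear system obtained from requiring ||z - v_i||^2 to be
    equal for all vertices v_i:  2 (v_i - v_0) . z = ||v_i||^2 - ||v_0||^2.
    """
    v0 = verts[0]
    A = 2.0 * (verts[1:] - v0)
    b = np.sum(verts[1:] ** 2, axis=1) - np.sum(v0 ** 2)
    z = np.linalg.solve(A, b)
    r2 = np.sum((v0 - z) ** 2)
    return z, r2

def uncertainty(x, tri):
    """Piecewise-quadratic uncertainty e(x): zero at every datapoint,
    positive in the interior of each Delaunay simplex."""
    x = np.asarray(x, dtype=float)
    i = int(tri.find_simplex(x))
    if i < 0:                       # x outside the triangulated hull
        return 0.0
    verts = tri.points[tri.simplices[i]]
    z, r2 = circumsphere(verts)
    return r2 - np.sum((x - z) ** 2)

# Usage: a few datapoints in the plane.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.4, 0.6]])
tri = Delaunay(pts)
e_vertex = uncertainty(pts[0], tri)        # essentially zero at a datapoint
e_interior = uncertainty([0.2, 0.2], tri)  # strictly positive inside a simplex
```

In a search step of the kind the abstract describes, this e(x) would be combined with an interpolant p(x) of the function evaluations, e.g. minimizing p(x) − K·e(x), where the weight K sets the exploration/refinement balance.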