Delaunay-based derivative-free optimization via global surrogates, part I: linear constraints

Article ID: iaor20163637
Volume: 66
Issue: 3
Start Page Number: 331
End Page Number: 382
Publication Date: Nov 2016
Journal: Journal of Global Optimization
Authors: Pooriya Beyhaghi, Daniele Cavaglieri, Thomas Bewley
Keywords: heuristics, matrices
Abstract:

A new derivative-free optimization algorithm is introduced for nonconvex functions within a feasible domain bounded by linear constraints. Global convergence is guaranteed for twice-differentiable functions with bounded Hessian, and the algorithm is found to be remarkably efficient even for many functions that are not differentiable. Like other response surface methods, at each optimization step the algorithm minimizes a metric combining an interpolation of the existing function evaluations and a model of the uncertainty of this interpolation. By adjusting the relative weighting of these two terms, the algorithm strikes a tunable balance between global exploration and local refinement; a rule to adjust this balance automatically is also presented. Unlike other methods, any well-behaved interpolation strategy may be used. The uncertainty model is built upon the framework of a Delaunay triangulation of the existing datapoints in parameter space. Within each simplex of this triangulation, a quadratic function is formed that goes to zero at each datapoint; the union of these quadratics over all simplices forms the desired uncertainty model. Care is taken to ensure that function evaluations are performed at points that are well situated in parameter space, that is, such that the simplices of the resulting triangulation have circumradii with a known bound. This facilitates well-behaved local refinement as additional function evaluations are performed.
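To make the construction concrete, the sketch below (a minimal illustration, not the authors' implementation) builds a Delaunay triangulation of a set of datapoints with scipy, forms within each simplex the circumsphere-based quadratic r^2 - ||x - z||^2, which is one natural quadratic that vanishes at every vertex, and combines it with an interpolant via the search metric p(x) - K*e(x) described above. The interpolant p, the weight K, and the sample points are placeholder assumptions for illustration.

```python
import numpy as np
from scipy.spatial import Delaunay

def circumsphere(verts):
    """Circumcenter z and squared circumradius r2 of a d-simplex (d+1 vertices).

    Solves 2 (p_i - p_0) . z = ||p_i||^2 - ||p_0||^2 for i = 1..d, which
    enforces equal distance from z to every vertex."""
    A = 2.0 * (verts[1:] - verts[0])
    b = np.sum(verts[1:] ** 2, axis=1) - np.sum(verts[0] ** 2)
    z = np.linalg.solve(A, b)
    return z, np.sum((verts[0] - z) ** 2)

def uncertainty(x, tri):
    """Uncertainty e(x): zero at every datapoint, quadratic within each simplex."""
    s = int(tri.find_simplex(x))
    if s < 0:
        return 0.0  # x lies outside the convex hull of the datapoints
    z, r2 = circumsphere(tri.points[tri.simplices[s]])
    return r2 - np.sum((x - z) ** 2)

# Search metric p(x) - K*e(x): interpolation minus weighted uncertainty.
rng = np.random.default_rng(0)
pts = rng.random((10, 2))        # existing function evaluations in 2-D
tri = Delaunay(pts)
p = lambda x: np.sum(x ** 2)     # stand-in for any well-behaved interpolant
K = 1.0                          # tunable exploration/refinement weight
x = np.array([0.4, 0.6])
print(p(x) - K * uncertainty(x, tri))
```

Since the vertices of each simplex lie on its circumsphere, the quadratic above is exactly zero at every datapoint and positive in the simplex interior; larger K therefore favors points far from existing evaluations (global exploration), while smaller K favors minimizing the interpolant itself (local refinement).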
