Efficient global optimization algorithm assisted by multiple surrogate techniques


Article ID: iaor20134128
Volume: 56
Issue: 2
Start Page Number: 669
End Page Number: 689
Publication Date: Jun 2013
Journal: Journal of Global Optimization
Authors:
Keywords: simulation: applications, neural networks
Abstract:

Surrogate‐based optimization proceeds in cycles. Each cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally analyzing a candidate solution. Algorithms that use the surrogate uncertainty estimator to guide the selection of the next sampling candidate are readily available, e.g., the efficient global optimization (EGO) algorithm. However, adding one single point at a time may not be efficient when the main concern is wall‐clock time (rather than number of simulations) and simulations can run in parallel. Also, the need for uncertainty estimates limits EGO‐like strategies to surrogates normally implemented with such estimates (e.g., kriging and polynomial response surface). We propose the multiple surrogate efficient global optimization (MSEGO) algorithm, which adds several points per optimization cycle with the help of multiple surrogates. We import uncertainty estimates from one surrogate to another to allow use of surrogates that do not provide them. The approach is tested on three analytic examples for nine basic surrogates including kriging, radial basis neural networks, linear Shepard, and six different instances of support vector regression. We found that MSEGO works well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
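The cycle the abstract describes (analyze designs, fit a surrogate, optimize an infill criterion on it, analyze the candidate) can be sketched as a basic EGO-style loop. The sketch below is illustrative only, not the paper's MSEGO implementation: it uses a Gaussian RBF interpolant as the surrogate and a crude distance-based uncertainty proxy in place of a true kriging variance, loosely echoing the paper's idea of supplying uncertainty estimates to surrogates that lack them. All function names, the test function, and the parameter values are assumptions.

```python
import numpy as np
from math import erf, exp, pi, sqrt

# Illustrative EGO-style loop (NOT the paper's MSEGO implementation).
# Surrogate: Gaussian RBF interpolant. Uncertainty: a distance-based proxy
# standing in for the kriging variance, in the spirit of "importing" an
# uncertainty estimate for a surrogate that does not provide one.

def f(x):
    """Cheap analytic stand-in for an expensive simulation (assumed)."""
    return (x - 0.3) ** 2 * np.sin(8.0 * x)

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def expected_improvement(mu, sigma, f_best):
    """Standard expected-improvement criterion for minimization."""
    if sigma < 1e-12:
        return 0.0
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm_cdf(z) + sigma * norm_pdf(z)

ell = 0.15                       # RBF length scale (assumed value)
X = np.array([0.0, 0.5, 1.0])    # initial design
y = f(X)

for cycle in range(5):           # one new point per cycle, as in basic EGO
    # Fit the RBF surrogate: solve K w = y (small jitter for conditioning).
    K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2.0 * ell ** 2))
    w = np.linalg.solve(K + 1e-8 * np.eye(len(X)), y)

    # Predict the surrogate mean on a candidate grid.
    cand = np.linspace(0.0, 1.0, 201)
    mu = np.exp(-((cand[:, None] - X[None, :]) ** 2) / (2.0 * ell ** 2)) @ w

    # Distance-based uncertainty proxy: zero at sampled points, growing
    # away from them (a stand-in for a kriging variance, not the real one).
    dist = np.min(np.abs(cand[:, None] - X[None, :]), axis=1)
    sigma = np.std(y) * dist / ell

    # Maximize EI to select the next expensive evaluation.
    f_best = y.min()
    ei = np.array([expected_improvement(m, s, f_best)
                   for m, s in zip(mu, sigma)])
    x_new = cand[np.argmax(ei)]
    X, y = np.append(X, x_new), np.append(y, f(x_new))

print(f"best after {len(X)} evaluations: {y.min():.4f}")
```

The MSEGO variation described in the abstract would, in each cycle, fit several surrogates, pair each with its own (possibly imported) uncertainty estimate, and select one infill point per surrogate, so a single cycle contributes several evaluations that can run in parallel rather than one.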
