Article ID: | iaor2008466 |
Country: | United Kingdom |
Volume: | 33 |
Issue: | 3 |
Start Page Number: | 255 |
End Page Number: | 265 |
Publication Date: | Jun 2005 |
Journal: | OMEGA |
Authors: | Ragsdale Cliff, Bergey Paul K. |
Keywords: | artificial intelligence, optimization, programming: nonlinear |
Over the past three decades, Evolutionary Algorithms have emerged as a powerful mechanism for finding solutions to large and complex problems. A promising new evolutionary algorithm known as Differential Evolution (DE) was recently introduced and has garnered significant attention in the research literature; it was recently shown to outperform several well-known stochastic optimization methods on an extensive set of test problems. This paper introduces a modification to DE that enhances its rate of convergence without compromising solution quality. Our Modified Differential Evolution (MDE) algorithm applies selection pressure when generating offspring, producing candidates that are more fit to survive than those generated by purely random operators. We demonstrate that MDE requires less computational effort than DE to locate globally optimal solutions to well-known test problems in the continuous domain.
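The abstract does not spell out DE's mechanics or the details of MDE. For orientation, the sketch below shows the classic DE/rand/1/bin scheme that MDE builds on: mutation by scaled difference vectors, binomial crossover, and greedy one-to-one selection. Function names, parameter values, and the test function are illustrative assumptions, not taken from the paper.

```python
import random

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9, gens=200, seed=0):
    """Minimize f over box constraints with the classic DE/rand/1/bin scheme.

    This is a generic sketch, not the authors' MDE variant; MDE additionally
    applies selection pressure when choosing the vectors used to build offspring.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    # initialize the population uniformly at random within the bounds
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # pick three distinct population members, none equal to the target i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            # mutation: base vector plus a difference vector scaled by F
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # binomial crossover; jrand guarantees at least one mutant gene
            jrand = rng.randrange(dim)
            trial = [mutant[d] if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            # clip the trial vector back into the feasible box
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            # greedy selection: the trial replaces the target only if no worse
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]
```

For example, `differential_evolution(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 2)` drives the 2-D sphere function close to its global minimum at the origin. The paper's convergence-rate claim concerns how many such function evaluations are needed before the global optimum is located.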