Article ID: iaor1995713
Country: United Kingdom
Volume: 21
Issue: 8
Start Page Number: 801
End Page Number: 822
Publication Date: Oct 1994
Journal: Computers and Operations Research
Authors: Glover F.
Keywords: heuristics
The paper identifies processes for structuring neural networks by reference to two classes of interacting mappings, one generating provisional outcomes (‘trial solutions’) and the other generating idealized representations, which are called ghost images. These mappings create an evolution of both the provisional outcomes and the ghost images, which in turn influence a parallel evolution of the mappings themselves. The ghost image models may be conceived as a generalization of the self-organizing neural network models of Kohonen. Alternatively, they may be viewed as a generalization of certain relaxation/restriction procedures of mathematical optimization. Hence, indirectly, they also generalize aspects of penalty-based neural models, such as those proposed by Hopfield and Tank. Both avenues of generalization are ‘context free’, without reliance on specialized theory, such as models of perception or mathematical duality. From a neural network standpoint, the ghost image framework makes it possible to extend previous Kohonen-based optimization approaches to incorporate components beyond a visually oriented frame of reference. This added level of abstraction yields a basis for solving optimization problems expressed entirely in symbolic (‘non-visual’) mathematical formulations. At the same time, it allows penalty-function ideas in neural networks to be extended to encompass other concepts springing from a mathematical optimization perspective, including
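
The abstract does not specify the two mappings concretely, so the following is only a minimal Python sketch, under assumptions not taken from the paper, of how such interacting mappings might be arranged for a toy binary quadratic problem: the ghost image starts as a continuous relaxation, trial solutions are projections of that image onto the feasible set, and both the image and a parameter of the projection mapping evolve from iteration to iteration. The function name, the blending rule, and the threshold update are illustrative assumptions, not Glover's procedure.

import numpy as np

def ghost_image_search(c, n_iter=50, seed=0):
    """Toy ghost-image-style iteration (illustrative sketch only).

    Minimize ||x - c||^2 over binary x in {0,1}^n via two interacting
    mappings: one maintains an idealized continuous 'ghost image', the
    other produces provisional binary 'trial solutions' from it.
    """
    rng = np.random.default_rng(seed)
    n = len(c)

    # Ghost-image mapping: an idealized representation -- here the solution
    # of the continuous relaxation (c clipped to the unit box).
    relaxed = np.clip(c, 0.0, 1.0)
    ghost = relaxed.copy()

    threshold = 0.5                    # parameter of the trial-solution mapping
    best_x, best_val = None, np.inf

    for t in range(n_iter):
        # Trial-solution mapping: project the ghost image onto the feasible
        # (binary) set, with a little noise so different trials are explored.
        noise = rng.normal(scale=0.05, size=n)
        x = (ghost + noise >= threshold).astype(float)
        val = float(np.sum((x - c) ** 2))

        if val < best_val:
            best_x, best_val = x.copy(), val

        # Evolution of the ghost image: blend the idealized relaxation with
        # the best provisional outcome found so far.
        alpha = t / (n_iter - 1)
        ghost = (1 - alpha) * relaxed + alpha * best_x

        # Evolution of the mapping itself: the rounding threshold drifts
        # toward the mean ghost value, so the trial-solution mapping adapts
        # to the images it is asked to project.
        threshold = 0.5 * threshold + 0.5 * float(ghost.mean())

    return best_x, best_val

if __name__ == "__main__":
    c = np.array([0.9, 0.2, 0.6, -0.3, 1.4])
    x, val = ghost_image_search(c)
    print("best binary solution:", x, "objective:", val)

In this sketch the trial solutions and the ghost image pull on each other in the spirit the abstract describes, but the specific update rules stand in for whatever mappings a given application would supply.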