Article ID: iaor20042267
Country: United Kingdom
Volume: 46
Issue: 2/3
Start Page Number: 455
End Page Number: 461
Publication Date: Jul 2003
Journal: Computers & Mathematics with Applications
Authors: Cancelliere R.
Keywords: markov processes
Some topics related to scattered data approximation and function approximation by linear superposition of basis functions are outlined. A new method to speed up the evaluation of the approximation is presented; it is particularly useful when very large sets of scattered data are involved, as in hypersurface reconstruction, image recognition, and speech processing. The method is based on principal components analysis and allows us to select and use only the salient features needed to correctly classify patterns. The error introduced by this technique is analyzed in the context of its application to sigmoidal and radial basis function neural networks, and upper bounds on this error are given. The new method is also compared with some other feature selection techniques to illustrate how to apply it and to show its effectiveness.
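The abstract gives no implementation details, so the following Python sketch only illustrates the general idea it describes: project scattered inputs onto their leading principal components and evaluate a radial-basis superposition in the reduced space. The synthetic data, the number of retained components, the number of centres, and the bandwidth are illustrative assumptions, not the paper's choices.

```python
# Minimal sketch: PCA-based dimensionality reduction before evaluating a
# Gaussian radial-basis-function approximation of scattered data.
# All sizes and parameters below are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic scattered data: n points in d dimensions with a scalar target.
n, d, k = 500, 10, 3                      # k = principal components kept
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# --- Principal components analysis of the inputs ---
mu = X.mean(axis=0)
Xc = X - mu
cov = Xc.T @ Xc / (n - 1)
eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
W = eigvecs[:, ::-1][:, :k]               # top-k principal directions
Z = Xc @ W                                # reduced-dimension features

# --- Gaussian RBF approximation built in the reduced space ---
centres = Z[rng.choice(n, size=50, replace=False)]   # assumed: 50 centres
sigma = np.median(np.linalg.norm(Z[:, None] - centres[None], axis=2))

def rbf_design(Zq):
    """Gaussian basis matrix between query points Zq and the centres."""
    d2 = np.sum((Zq[:, None, :] - centres[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Least-squares fit of the superposition weights.
Phi = rbf_design(Z)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# New points are first projected to k dimensions, so each basis evaluation
# works with k-dimensional distances instead of d-dimensional ones.
X_new = rng.normal(size=(5, d))
y_hat = rbf_design((X_new - mu) @ W) @ w
print(y_hat)
```

In this sketch the speed-up comes only from evaluating distances in k rather than d dimensions; the paper's error analysis for sigmoidal and radial networks is not reproduced here.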