Article ID: iaor20115671
Volume: 217
Issue: 22
Start Page Number: 9041
End Page Number: 9052
Publication Date: Jul 2011
Journal: Applied Mathematics and Computation
Authors: Chen Xiaohong, Chen Songcan, Xue Hui
Keywords: statistics: regression
In this paper, a novel supervised dimensionality reduction method is developed based on both correlation analysis and the idea of large margin learning. The method aims to maximize the minimal correlation between each dimensionality-reduced instance and its class label, and is hence named large correlation analysis (LCA). Unlike most existing correlation analysis methods such as CCA, CCAs and CDA, which all maximize the total or ensemble correlation over all training instances, LCA is devoted to maximizing the individual correlations between the given instances and their associated labels, and is established by solving a relaxed quadratic program with box constraints. Experimental results on real-world datasets from UCI and USPS show its effectiveness compared with existing canonical correlation analysis methods.
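The abstract does not spell out the optimization in detail, but the max-min idea behind LCA can be illustrated with a small sketch. The code below is an illustrative interpretation, not the paper's exact formulation: it maximizes the worst-case (minimal) label-weighted projection score over all instances via the standard epigraph reformulation, using synthetic toy data and SciPy's SLSQP solver; the variable names, the toy data, and the norm bound on the projection direction are all assumptions.

```python
# Illustrative sketch (not the paper's exact formulation): maximize the
# minimal per-instance agreement between projected data and class labels,
# rewritten with an epigraph variable t and solved with SLSQP.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy two-class data: n instances, d features, labels in {-1, +1} (assumed).
n, d = 40, 5
X = rng.normal(size=(n, d))
y = np.where(rng.normal(size=n) > 0, 1.0, -1.0)
X = X + 0.8 * np.outer(y, np.ones(d))   # inject some class structure
Xc = X - X.mean(axis=0)                 # center the features

def neg_min_score(z):
    """Objective: negative of the epigraph variable t (the worst-case score)."""
    return -z[-1]

def score_constraints(z):
    """Each instance's label-weighted projection must be at least t."""
    w, t = z[:d], z[-1]
    return y * (Xc @ w) - t

def norm_constraint(z):
    """Bound the projection direction: ||w||^2 <= 1."""
    w = z[:d]
    return 1.0 - w @ w

z0 = np.zeros(d + 1)
res = minimize(
    neg_min_score, z0, method="SLSQP",
    constraints=[{"type": "ineq", "fun": score_constraints},
                 {"type": "ineq", "fun": norm_constraint}],
)
w_opt, t_opt = res.x[:d], res.x[-1]
print("minimal per-instance score:", t_opt)
```

In contrast, an ensemble-style criterion such as CCA would optimize an average correlation over all instances; replacing that average with the minimum, as in the sketch above, is what gives the formulation its large-margin flavor and leads to the box-constrained quadratic programming relaxation described in the abstract.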