Non-linear Dimensionality Reduction Procedures for Certain Large-Dimensional Multi-objective Optimization Problems: Employing Correntropy and a Novel Maximum Variance Unfolding

In our recent publication [1], we began with the observation that many real-world applications of multi-objective optimization involve a large number of objectives (10 or more), whereas existing evolutionary multi-objective optimization (EMO) methods have primarily been applied to problems with a smaller number of objectives (5 or fewer). After highlighting the major impediments in handling a large number of objectives, we proposed a principal component analysis (PCA) based EMO procedure for dimensionality reduction, whose efficacy was demonstrated by solving up to 50-objective optimization problems. Here, we address the fact that when the data points lie on a non-linear manifold, or when the data structure is non-Gaussian, PCA, which yields a lower-dimensional 'linear' subspace, may be ineffective in revealing the underlying dimensionality. To overcome this, we propose two new non-linear dimensionality reduction algorithms for evolutionary multi-objective optimization, namely C-PCA-NSGA-II and MVU-PCA-NSGA-II. While the former is based on the newly introduced correntropy PCA [2], the latter implements the maximum variance unfolding principle [3,4,5] in a novel way. We also establish the superiority of these new EMO procedures over the earlier PCA-based procedure, in terms of both accuracy and computational time, by solving up to 50-objective optimization problems.
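To make the two non-linear building blocks concrete, the following is a minimal, illustrative Python sketch, not the paper's implementation. The first function realizes correntropy PCA in the kernel-PCA style of [2]: a correntropy matrix (here estimated with a Gaussian kernel averaged over coordinates) is centred and eigendecomposed. The second poses maximum variance unfolding as its standard semidefinite program [3,4,5]: maximize the trace of a centred Gram matrix subject to preserving local neighbourhood distances. All function names, parameter values, and the use of numpy, cvxpy, and scikit-learn here are our assumptions for illustration; normalisation details may differ from the cited papers.

```python
import numpy as np
import cvxpy as cp
from sklearn.neighbors import kneighbors_graph


def correntropy_matrix(X, sigma=1.0):
    """V[i, j] = mean_k kappa_sigma(X[i, k] - X[j, k]) with a Gaussian kernel.
    (Sketch; the estimator in the correntropy PCA paper may normalise differently.)"""
    diff = X[:, None, :] - X[None, :, :]            # (n, n, d) pairwise coordinate differences
    k = np.exp(-diff ** 2 / (2.0 * sigma ** 2))     # element-wise Gaussian kernel
    return k.mean(axis=2)                           # average over the d coordinates


def c_pca(X, n_components=2, sigma=1.0):
    """Correntropy PCA: eigendecompose the centred correntropy matrix,
    exactly as kernel PCA does with an ordinary Gram matrix."""
    n = X.shape[0]
    V = correntropy_matrix(X, sigma)
    J = np.eye(n) - np.ones((n, n)) / n             # centring matrix
    Vc = J @ V @ J
    w, U = np.linalg.eigh(Vc)                       # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:n_components]        # keep the largest components
    return U[:, idx] * np.sqrt(np.abs(w[idx]))


def mvu(X, n_neighbors=5, n_components=2):
    """Maximum variance unfolding as a semidefinite program:
    maximize trace(K) subject to K being a centred PSD Gram matrix that
    preserves squared distances between k-nearest neighbours."""
    n = X.shape[0]
    G = kneighbors_graph(X, n_neighbors, mode="distance").toarray()
    G = np.maximum(G, G.T)                          # symmetrize the neighbour graph
    K = cp.Variable((n, n), PSD=True)
    constraints = [cp.sum(K) == 0]                  # centre the embedding at the origin
    for i in range(n):
        for j in range(i + 1, n):
            if G[i, j] > 0:                         # local isometry constraint
                constraints.append(K[i, i] + K[j, j] - 2 * K[i, j] == G[i, j] ** 2)
    cp.Problem(cp.Maximize(cp.trace(K)), constraints).solve()
    w, U = np.linalg.eigh(K.value)
    idx = np.argsort(w)[::-1][:n_components]
    return U[:, idx] * np.sqrt(np.maximum(w[idx], 0))


# Example: embed a set of objective vectors (random stand-in data here).
# The SDP has O(n^2) variables, so plain MVU only scales to small point sets;
# the paper's MVU-PCA-NSGA-II embeds it differently to keep the cost manageable.
points = np.random.rand(60, 10)                     # 60 solutions, 10 objectives
low_dim_c = c_pca(points, n_components=3, sigma=0.5)
low_dim_m = mvu(points, n_neighbors=5, n_components=3)
```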

[1] Kilian Q. Weinberger et al. An Introduction to Nonlinear Dimensionality Reduction by Maximum Variance Unfolding. AAAI, 2006.

[2] Kilian Q. Weinberger et al. Learning a Kernel Matrix for Nonlinear Dimensionality Reduction. ICML, 2004.

[3] Nello Cristianini et al. Kernel Methods for Pattern Analysis. ICTAI, 2003.

[4] Marco Laumanns et al. Scalable Multi-objective Optimization Test Problems. Proceedings of the 2002 Congress on Evolutionary Computation (CEC'02), 2002.

[5] J. Tenenbaum et al. A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science, 2000.

[6] Kilian Q. Weinberger et al. Spectral Methods for Dimensionality Reduction. Semi-Supervised Learning, 2006.

[7] Bernhard Schölkopf et al. A Kernel View of the Dimensionality Reduction of Manifolds. ICML, 2004.

[8] José Carlos Príncipe et al. Nonlinear Component Analysis Based on Correntropy. IEEE International Joint Conference on Neural Networks (IJCNN), 2006.

[9] Bernhard Schölkopf et al. Nonlinear Component Analysis as a Kernel Eigenvalue Problem. Neural Computation, 1998.

[10] S. T. Roweis et al. Nonlinear Dimensionality Reduction by Locally Linear Embedding. Science, 2000.

[11] Kalyanmoy Deb et al. A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 2002.

[12] Mikhail Belkin et al. Laplacian Eigenmaps for Dimensionality Reduction and Data Representation. Neural Computation, 2003.

[13] Kilian Q. Weinberger et al. Unsupervised Learning of Image Manifolds by Semidefinite Programming. CVPR, 2004.

[14] Bernhard Schölkopf et al. Learning with Kernels. 2001.

[15] Marco Laumanns et al. Scalable Test Problems for Evolutionary Multi-objective Optimization. 2001.

[16] Kalyanmoy Deb et al. Multi-objective Optimization Using Evolutionary Algorithms. Wiley-Interscience Series in Systems and Optimization, 2001.

[17] Chengjun Liu et al. Capitalize on Dimensionality Increasing Techniques for Improving Face Recognition Grand Challenge Performance. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2006.