Meta-Modeling in Multiobjective Optimization

In many practical engineering design and other scientific optimization problems, the objective function is not given in closed form in terms of the design variables. Instead, given values of the design variables, the value of the objective function is obtained by some numerical analysis, such as structural, fluid-mechanic, or thermodynamic analysis; it may even be obtained by conducting a real (physical) experiment and taking direct measurements. These evaluations are usually far more time-consuming than evaluations of closed-form functions. To keep the number of evaluations as small as possible, iterative search can be combined with meta-modeling: the objective function is modeled during optimization by fitting a function through the evaluated points, and this model is then used to predict the values of future search points, so that high-performance regions of the design space can be identified more rapidly. This chapter surveys meta-modeling approaches and their suitability to specific problem contexts, relating aspects such as dimensionality, noise, and the expensiveness of evaluations to the choice of method. The multiobjective version of the meta-modeling problem raises further issues, such as how to define improvement in a Pareto approximation set and how to model each objective function. The possibility of interactive methods combining meta-modeling with decision-making is also covered. Two example applications are included: a multiobjective biochemistry problem involving instrument optimization, and the seismic design of reinforcement for cable-stayed bridges.
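The meta-modeling loop described above can be sketched in a few lines. The following is a minimal single-objective illustration, not the chapter's method: `expensive_objective` is a hypothetical stand-in for a costly simulation, and a simple Gaussian radial-basis-function interpolant plays the role of the meta-model. At each iteration the surrogate is refitted through all evaluated points, and the expensive function is evaluated only at the candidate the surrogate predicts to be best.

```python
import numpy as np

def expensive_objective(x):
    # Hypothetical stand-in for a costly simulation or physical experiment.
    return (x - 0.3) ** 2 + 0.1 * np.sin(20 * x)

def fit_rbf(X, y, eps=1.0):
    # Fit a Gaussian RBF interpolant exactly through the evaluated points.
    K = np.exp(-eps * (X[:, None] - X[None, :]) ** 2)
    w = np.linalg.solve(K + 1e-8 * np.eye(len(X)), y)
    return lambda x: np.exp(-eps * (x[:, None] - X[None, :]) ** 2) @ w

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 5)          # small initial design
y = expensive_objective(X)

for _ in range(10):                    # meta-modeling loop
    model = fit_rbf(X, y)
    cand = np.linspace(0.0, 1.0, 201)  # cheap candidate grid
    pred = model(cand)
    # Exclude candidates already evaluated to keep the interpolant well-posed.
    pred[np.min(np.abs(cand[:, None] - X[None, :]), axis=1) < 1e-3] = np.inf
    x_new = cand[np.argmin(pred)]      # point the surrogate predicts to be best
    X = np.append(X, x_new)
    y = np.append(y, expensive_objective(x_new))

print(X[np.argmin(y)], y.min())        # best design found with 15 evaluations
```

In the multiobjective setting discussed in the chapter, the same loop applies but each objective (or a scalarization of them) is modeled, and "best predicted" is replaced by a measure of improvement over the current Pareto approximation set.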
