Analyzing convergence performance of evolutionary algorithms: A statistical approach

Analyzing the performance of different approaches is a staple concern in the design of Computational Intelligence experiments. Any proper analysis of evolutionary optimization algorithms should incorporate a full set of benchmark problems and state-of-the-art comparison algorithms. For the sake of rigor, such an analysis should be complemented with statistical procedures that support the conclusions drawn. In this paper, we point out that these conclusions are usually limited to the final results, whereas intermediate results are seldom considered. We propose a new methodology for comparing the convergence capabilities of evolutionary algorithms, based on Page’s trend test. The methodology is illustrated with a use case that incorporates real results from techniques selected from a recent special issue. The possible applications of the method are highlighted, particularly in cases where the final results alone do not enable a clear assessment of the differences among several evolutionary techniques.
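
To make the core idea concrete, the following Python sketch applies Page's trend test (via its normal approximation) to a matrix of per-problem differences between two algorithms' errors recorded at successive cut points of the search. The setup, variable names, and synthetic data are illustrative assumptions rather than the paper's exact experimental procedure; a significant increasing trend in the differences would suggest that one algorithm keeps gaining ground on the other as the runs progress.

```python
# Minimal sketch of Page's trend test applied to convergence data.
# Assumptions (illustrative, not taken verbatim from the paper): two algorithms A and B
# are run on several benchmark problems, their best errors are recorded at a fixed set
# of cut points, and the per-problem differences err_A - err_B at those cut points are
# tested for an increasing trend across the run.

import numpy as np
from scipy.stats import norm, rankdata


def pages_trend_test(samples):
    """Page's test for ordered alternatives (normal approximation).

    samples: (n_problems, k_cutpoints) array. Returns the L statistic, the
    z-score of its normal approximation, and the one-sided p-value for an
    increasing trend across the k columns.
    """
    n, k = samples.shape
    ranks = np.apply_along_axis(rankdata, 1, samples)        # rank within each problem
    col_rank_sums = ranks.sum(axis=0)                        # R_j for j = 1..k
    L = float(np.sum(np.arange(1, k + 1) * col_rank_sums))   # L = sum_j j * R_j

    mean_L = n * k * (k + 1) ** 2 / 4                        # E[L] under H0 (no trend)
    var_L = n * k ** 2 * (k + 1) * (k ** 2 - 1) / 144        # Var[L] under H0
    z = (L - mean_L) / np.sqrt(var_L)
    p_value = norm.sf(z)                                     # upper tail: increasing trend
    return L, z, p_value


# Hypothetical usage: differences err_A - err_B on 5 problems at 10 cut points,
# generated so that they tend to grow as the runs progress.
rng = np.random.default_rng(0)
trend = np.linspace(0.0, 1.0, 10)
diffs = trend + 0.2 * rng.standard_normal((5, 10))
L, z, p = pages_trend_test(diffs)
print(f"L = {L:.1f}, z = {z:.2f}, one-sided p = {p:.4f}")
```

The normal approximation is reasonable for the numbers of problems and cut points typically used in such comparisons; for very small designs, exact critical values of the L statistic could be consulted instead.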
