Initialization and Displacement of the Particles in TRIBES, a Parameter-Free Particle Swarm Optimization Algorithm

This chapter presents two improvements to TRIBES, a parameter-free Particle Swarm Optimization (PSO) algorithm. Standard PSO requires tuning a set of parameters, and the algorithm's performance is strongly linked to the values chosen; finding an optimal parameter set is, however, a hard and time-consuming problem in itself. To avoid parameter fitting, Clerc devised TRIBES, a fully adaptive algorithm. Its experimental results are encouraging but still lag behind those of many other algorithms. This chapter demonstrates how TRIBES can be improved by adopting a new particle initialization scheme and by hybridizing it with an Estimation of Distribution Algorithm (EDA). Both improvements aim to let the algorithm explore the search space as widely as possible and avoid premature convergence to a local optimum. The results obtained show that the proposed algorithm performs as well as, or better than, the algorithms it is compared against.
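To make the parameter-tuning burden concrete, the following is a minimal sketch of the standard PSO velocity and position update that TRIBES aims to make adaptive. The coefficient values (inertia weight `w` and acceleration coefficients `c1`, `c2`), the swarm size, and the sphere test function are common illustrative defaults, not the settings of TRIBES itself.

```python
import random

def sphere(x):
    """Sphere benchmark: f(x) = sum(x_i^2), minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def pso(f, dim=5, n_particles=20, iters=200,
        w=0.729, c1=1.494, c2=1.494, lo=-5.0, hi=5.0, seed=0):
    """Standard (non-adaptive) PSO: every parameter above must be hand-tuned,
    which is exactly what a parameter-free scheme like TRIBES avoids."""
    rng = random.Random(seed)
    # Uniform random initialization of positions; zero initial velocities.
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]               # personal best positions
    pbest_val = [f(p) for p in pos]           # personal best values
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(sphere)
```

A poor choice of `w`, `c1`, or `c2` can cause the swarm to diverge or stagnate; TRIBES removes these knobs by adapting the swarm's structure and each particle's displacement strategy during the run.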
