Parameter investigation in brain storm optimization

Human beings are the most intelligent organisms in the world, and the brainstorming process they commonly use has proven to be a significant and promising way to generate good ideas for problem solving. Brain storm optimization (BSO) is a swarm intelligence algorithm inspired by the human creative problem-solving process. BSO transplants the brainstorming process into optimization algorithm design and has achieved notable success. BSO generally uses grouping, replacing, and creating operators to produce as many ideas as possible, so as to approach the problem solution generation by generation. These operators involve mainly three control parameters: (1) p_replace, which controls the replacing operator; (2) p_one, which controls whether the creating operator creates a new idea from one cluster or from two clusters; and (3) p_center (p_one_center and p_two_center), which controls whether a cluster center or a random idea is used to create the new idea. In this paper, we investigate these parameters to see how they affect the performance of BSO. More importantly, a new BSO variant designed according to the investigation results is proposed and its performance is evaluated.
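To make the role of the three control parameters concrete, the following is a minimal sketch of one BSO generation in Python. It assumes minimization, k-means for the grouping operator, illustrative search bounds of [-1, 1] for the replacing operator, and the logistic step-size schedule used in the original BSO paper; the function name, default parameter values, and helper structure are illustrative assumptions, not the exact implementation studied in the paper.

```python
import math
import random

import numpy as np
from sklearn.cluster import KMeans


def bso_step(ideas, fitness_fn, n_clusters=5,
             p_replace=0.2, p_one=0.8,
             p_one_center=0.4, p_two_center=0.5,
             cur_iter=0, max_iter=2000):
    """One generation of basic BSO, organized around the three control
    parameters discussed above (sketch; details are assumptions)."""
    n, dim = ideas.shape

    # Grouping operator: cluster the ideas; the best idea in each cluster
    # serves as that cluster's center.
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(ideas)
    clusters = [np.where(labels == c)[0] for c in range(n_clusters)]
    centers = [ideas[idx[np.argmin([fitness_fn(ideas[i]) for i in idx])]]
               for idx in clusters]

    # Replacing operator: with probability p_replace, overwrite one
    # randomly chosen cluster center with a randomly generated idea
    # (bounds [-1, 1] assumed here for illustration).
    if random.random() < p_replace:
        centers[random.randrange(n_clusters)] = np.random.uniform(-1, 1, dim)

    # Step size shrinks over iterations (logistic schedule, slope 20).
    xi = (1.0 / (1.0 + math.exp(-(0.5 * max_iter - cur_iter) / 20.0))) * random.random()

    new_ideas = np.empty_like(ideas)
    for i in range(n):
        if random.random() < p_one:
            # Creating operator, one-cluster branch.
            c = random.randrange(n_clusters)
            if random.random() < p_one_center:
                base = centers[c]                         # use the cluster center
            else:
                base = ideas[random.choice(clusters[c])]  # use a random idea
        else:
            # Creating operator, two-cluster branch: combine two clusters.
            c1, c2 = random.sample(range(n_clusters), 2)
            w = random.random()
            if random.random() < p_two_center:
                base = w * centers[c1] + (1 - w) * centers[c2]
            else:
                base = (w * ideas[random.choice(clusters[c1])]
                        + (1 - w) * ideas[random.choice(clusters[c2])])

        candidate = base + xi * np.random.randn(dim)
        # Keep the better of the old and the new idea (minimization).
        new_ideas[i] = candidate if fitness_fn(candidate) < fitness_fn(ideas[i]) else ideas[i]
    return new_ideas
```

Under these assumptions, p_replace gates how often a cluster center is discarded, p_one splits idea creation between the one-cluster and two-cluster branches, and p_one_center / p_two_center decide whether the selected cluster(s) contribute their centers or random members.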
