Convergence Models of Genetic Algorithm Selection Schemes

We discuss the use of normal distribution theory as a tool to model the convergence characteristics of different GA selection schemes. The models predict the proportion of optimal alleles as a function of the number of generations when optimizing the bit-counting (OneMax) function. The selection schemes analyzed are proportionate selection, tournament selection, truncation selection, and elitist recombination. Simple yet accurate models are derived that show only a slight deviation from the experimental results. It is argued that this small difference is due to the build-up of covariances between the alleles — a phenomenon called linkage disequilibrium in quantitative genetics. We conclude with a brief discussion of this linkage disequilibrium.
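As an illustrative sketch (not the authors' analytical model), the quantity being modeled — the proportion of optimal alleles per generation — can be measured empirically with a minimal GA using tournament selection and uniform crossover on the bit-counting function. All parameter values below are arbitrary choices for demonstration:

```python
import random

def onemax(bits):
    """Bit-counting (OneMax) fitness: the number of 1 alleles."""
    return sum(bits)

def tournament_ga(n_bits=50, pop_size=100, tournament_size=2,
                  generations=30, crossover_prob=0.9, seed=0):
    """Run a simple GA with tournament selection on OneMax and
    return the proportion of optimal (1) alleles per generation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    history = []

    def select():
        # Tournament selection: pick the best of s random individuals.
        contenders = [rng.choice(pop) for _ in range(tournament_size)]
        return max(contenders, key=onemax)

    for _ in range(generations):
        # Record the proportion of optimal alleles in the population.
        history.append(sum(map(onemax, pop)) / (pop_size * n_bits))
        nxt = []
        while len(nxt) < pop_size:
            a, b = select(), select()
            if rng.random() < crossover_prob:
                # Uniform crossover: each allele taken from either parent.
                child = [ai if rng.random() < 0.5 else bi
                         for ai, bi in zip(a, b)]
            else:
                child = a[:]
            nxt.append(child)
        pop = nxt
    return history

hist = tournament_ga()
```

Plotting `hist` against the generation number gives the empirical convergence curve that the normal-distribution models approximate; the covariances between alleles that crossover leaves behind (linkage disequilibrium) are what such a sketch would exhibit but the simple models neglect.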