Fitness Landscape and Evolutionary Boolean Synthesis using Information
In this paper we show how information-theoretic concepts can be used in evolutionary circuit design and minimization problems. Conditional entropy, mutual information, and normalized mutual information are commonly used to measure or estimate the amount of information shared by two random variables. Although the single number reported by any of these measures can guide the evolutionary search, we show that normalized mutual information produces a more amenable fitness landscape for search than the others. Several landscape plots and experiments are used to support and explain our main argument.
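The measures named above can be computed directly from the output column of a candidate circuit and the target truth table. The following sketch illustrates one common symmetric normalization, NMI = 2·I(X;Y)/(H(X)+H(Y)); the paper's exact normalization may differ (the cited Studholme et al. measure, for instance, uses (H(X)+H(Y))/H(X,Y)), so treat this as an assumption for illustration only:

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy H(X) in bits of a sequence of symbols."""
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def normalized_mutual_information(xs, ys):
    """NMI = 2*I(X;Y) / (H(X) + H(Y)); 1 for a perfect match, 0 if independent.

    This symmetric form is one common choice; it is an assumption here,
    not necessarily the normalization used in the paper itself.
    """
    hx, hy = entropy(xs), entropy(ys)
    if hx + hy == 0:
        return 1.0  # both columns are constant: trivially identical structure
    return 2.0 * mutual_information(xs, ys) / (hx + hy)

# Fitness of a hypothetical candidate circuit: compare its output column
# against the target truth-table column over all input assignments.
target    = [0, 1, 1, 0, 0, 1, 1, 0]   # illustrative 3-input target function
candidate = [0, 1, 1, 0, 1, 1, 1, 0]   # near-miss circuit output (one bit off)

print(normalized_mutual_information(target, target))     # -> 1.0
print(normalized_mutual_information(target, candidate))  # strictly between 0 and 1
```

Note that information measures are invariant under relabeling, so the exact complement of the target also scores 1.0; a practical evolutionary fitness function based on MI or NMI must handle this ambiguity, e.g. by allowing a free output inverter.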