Fitness Landscape and Evolutionary Boolean Synthesis using Information

In this paper we show how concepts from information theory can be applied to evolutionary circuit design and minimization problems. Conditional entropy, mutual information, and normalized mutual information are commonly used to measure or estimate the amount of information shared by two random variables. Although the scalar value reported by any of these measures can guide the evolutionary search, we show that normalized mutual information produces a more amenable fitness landscape for search than the others. Several landscape plots and experiments support and explain our main argument.
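The three measures mentioned above can be sketched for binary truth tables as follows. This is a minimal illustration, not the paper's implementation: the helper names are ours, the entropies are computed empirically from the output vectors, and we assume the common symmetric normalization NMI = 2·I(X;Y)/(H(X)+H(Y)) (other normalizations exist).

```python
from collections import Counter
from math import log2

def entropy(values):
    """Empirical Shannon entropy (bits) of a sequence of symbols."""
    n = len(values)
    return -sum(c / n * log2(c / n) for c in Counter(values).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def conditional_entropy(xs, ys):
    """H(X|Y) = H(X,Y) - H(Y)."""
    return entropy(list(zip(xs, ys))) - entropy(ys)

def normalized_mutual_information(xs, ys):
    """Symmetric normalization: 2*I(X;Y) / (H(X) + H(Y))."""
    hx, hy = entropy(xs), entropy(ys)
    if hx + hy == 0:  # both outputs are constant
        return 1.0
    return 2 * mutual_information(xs, ys) / (hx + hy)

# Hypothetical example: compare a target truth table (XOR) against
# the output column of a candidate circuit (here, OR) as a fitness signal.
target    = [0, 1, 1, 0]  # XOR over inputs 00, 01, 10, 11
candidate = [0, 1, 1, 1]  # OR  over the same inputs
print(normalized_mutual_information(target, candidate))
```

An exact match yields an NMI of 1.0, while a partially correct candidate falls strictly between 0 and 1, which is what makes the measure usable as a fitness score.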