Computations Inspired by the Brain

The brain is a highly complex system composed of a vast number of neurons. It is organized in a hierarchy of levels: the molecular level, the cellular level, the network level, the system level, and the more abstract level of mind. Its principles of computation differ largely from those of conventional computation, in which symbols are processed by logical operations. Information is represented by spatio-temporal patterns in a distributed manner, and computations are emergent, stochastic, and cooperative rather than exact logical calculations. The brain also learns and memorizes.

We present miscellaneous topics of computation related to neural information processing. The first is statistical neurodynamics, which treats computation by randomly connected neurons. We demonstrate the robustness and stability of such computation by using a simple stochastic law. This also reveals that neural computation is shallow, converging quickly to stable states, where the small-world phenomenon is observed.

The cortices are regarded as two-dimensional layered neural fields, in which computation may be realized by the dynamics of spatio-temporal pattern formation and synchronization. We show simple examples of pattern formation and of collisions of patterns in neural fields.

Learning and memory constitute another computation peculiar to neural systems, and we show some mechanisms of self-organization. An associative memory model exhibits aspects of memory fundamentally different from the conventional one: memory patterns are not recalled from a stock but are generated anew each time, and the basin of attraction has a fractal structure.

We finally touch upon social computation, using the prisoner's dilemma and ultimatum games.
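The statistical-neurodynamics picture sketched above can be illustrated with a toy simulation: a large population of randomly connected binary neurons, of which we track only the macroscopic activity level (the fraction of firing neurons). This is a minimal sketch, not the chapter's actual model; the network size, Gaussian couplings, and zero threshold are all illustrative assumptions. The macroscopic activity settles within a few steps, echoing the claim that such computation is shallow and converges quickly.

```python
import numpy as np

# Toy statistical-neurodynamics simulation: n randomly connected
# binary (0/1) neurons. We observe only the macroscopic activity
# a_t = fraction of firing neurons. All parameters are illustrative.
rng = np.random.default_rng(0)
n = 2000
w = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n))  # random couplings
theta = 0.0                                         # firing threshold

x = (rng.random(n) < 0.5).astype(float)  # random initial state
activity = [x.mean()]
for t in range(15):
    x = (w @ x > theta).astype(float)    # synchronous update
    activity.append(x.mean())

print([round(a, 3) for a in activity])
```

Although the microscopic state keeps changing, the activity level hovers tightly around a fixed value after the first step: the macroscopic law is far simpler than the microscopic dynamics.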
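Pattern formation in a neural field can likewise be sketched in a few lines. The following is a hedged, discretized one-dimensional example with a difference-of-Gaussians ("Mexican hat") kernel: local excitation, broader inhibition, and a resting level below threshold. The kernel widths, gains, and resting level are illustrative choices, not parameters from the text; with them, a localized stimulus relaxes to a self-sustained bump of activity.

```python
import numpy as np

# Discretized 1-D neural field:  du/dt = -u + K * f(u) + h
# with a Mexican-hat interaction kernel. Parameters are illustrative.
L, dx, dt = 10.0, 0.1, 0.05
x = np.arange(-L, L, dx)

def kernel(d, a=1.0, b=0.5, sa=1.0, sb=3.0):
    # difference of Gaussians: narrow excitation, broad inhibition
    return a * np.exp(-d**2 / (2 * sa**2)) - b * np.exp(-d**2 / (2 * sb**2))

K = kernel(x[:, None] - x[None, :]) * dx   # convolution matrix
f = lambda u: (u > 0).astype(float)        # Heaviside firing function
h = -0.3                                   # resting level below threshold

u = h + np.exp(-x**2)                      # localized initial stimulus
for _ in range(400):
    u = u + dt * (-u + K @ f(u) + h)       # forward-Euler integration

width = f(u).sum() * dx
print("width of the sustained excited region:", round(width, 2))
```

After the transient, the excited region neither dies out nor ignites the whole field: the bump is a localized stable pattern, the simplest instance of the spatio-temporal pattern formation mentioned above.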
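The social-computation topic can be made concrete with the prisoner's dilemma. Below is a hedged sketch of the standard payoff matrix with the conventional illustrative values (T, R, P, S) = (5, 3, 1, 0), plus a short iterated game pitting tit-for-tat against unconditional defection; the strategy names and round count are my own illustrative choices, not the chapter's setup.

```python
# Prisoner's dilemma: each entry gives (row player, column player)
# payoffs, with temptation > reward > punishment > sucker.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def iterated_game(strategy_a, strategy_b, rounds=10):
    """Play repeated rounds; each strategy sees the opponent's history."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(history_b)
        b = strategy_b(history_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        history_a.append(a)
        history_b.append(b)
    return score_a, score_b

tit_for_tat = lambda opp: "C" if not opp else opp[-1]   # copy last move
always_defect = lambda opp: "D"

print(iterated_game(tit_for_tat, always_defect))  # → (9, 14)
```

Tit-for-tat is exploited only in the first round and then defects along with its opponent, which is the standard starting point for studying how cooperation can emerge in repeated social interaction.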