Computing with structured connectionist networks. Technical report

Rapid advances in both the neurosciences and computer science are beginning to generate new interest in computational models that link animal brains and behavior. Within computer science, there is a large and growing body of knowledge about parallel computation and another, largely separate, science of artificial intelligence. Looking directly at massively parallel realizations of intelligent activity promises to be fruitful for the study of both natural and artificial computation. Much attention has been directed toward the biological implications of this interdisciplinary effort, but there are equally important connections to computational theory, hardware, and software. This article focuses on the design and use of massively parallel computational models, particularly in artificial intelligence. Much of the recent work on massively parallel computation has been carried out by physicists and examines the emergent behavior of large, unstructured collections of computing units. We are more concerned with how one can design, realize, and analyze networks that embody the specific computational structures needed to solve hard problems. Adaptation and learning are treated as ways to improve structured networks, not as a replacement for analysis and design.
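To make the contrast concrete, here is a small illustrative sketch (not taken from the report) of a *structured* network of simple computing units: a winner-take-all circuit, a classic connectionist building block in which units are deliberately wired to inhibit their rivals so that only the unit with the strongest initial evidence remains active. The function name, parameters, and update rule below are assumptions chosen for illustration, not the report's own formulation.

```python
def winner_take_all(activations, inhibition=0.2, rounds=20):
    """Iteratively apply mutual inhibition among units until one dominates.

    Each unit's activity is reduced in proportion to the total activity of
    its competitors; activity is clipped at zero.  This is a hand-designed
    (structured) connection pattern, not an emergent property of a random
    collection of units.
    """
    a = list(activations)
    for _ in range(rounds):
        total = sum(a)
        # suppress each unit by the summed activity of all other units
        a = [max(0.0, x - inhibition * (total - x)) for x in a]
    return a

# Unit 1 starts with the strongest evidence and is the only survivor.
final = winner_take_all([0.5, 0.9, 0.3])
winner = max(range(len(final)), key=lambda i: final[i])
```

The point of the example is that the inhibitory wiring is chosen by the designer to compute a specific function (selecting a maximum), in the spirit of networks engineered to embody particular computational structures rather than analyzed only for emergent behavior.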