A CURSORY INTRODUCTION TO THE PHYSICISTS' NEURAL NETWORKS

Résumé - We present a very short introduction to networks of formal neurons, highlighting the main problems and current research directions, and indicating a few bibliographic references.

Abstract - The paper is a very short introduction to networks of formal neurons, intended to outline the main problems and trends and provide some basic references.

The first attempts at designing machines inspired by the nervous system were made in the 1960s, but these attempts were only partly successful: limitations soon became apparent that could not be overcome with the technical and conceptual tools available at that time [3]. The level of activity in the field remained fairly low for some twenty years. In 1982, the physicist John J. Hopfield published his seminal paper in which he recognized the analogy between Hebb's cell assemblies and disordered magnetic systems [4]. The proof that the proposed model was analytically solvable [5] triggered a large research effort in recent years. The number of papers on neural networks, particularly of the Hopfield type, originating from the physics community, is probably well over one hundred at the present day [6]. An even larger number of papers, many of which are authored by computer scientists or electrical engineers, describe potential applications of various types of neural networks [5], in such fields as artificial vision, speech understanding and synthesis, combinatorial optimization, interpolation and prediction ...

The elementary component in most of these models is the McCulloch-Pitts formal neuron, or a somewhat more elaborate version thereof. Such processing units are vastly less complex than real neurons. It is, however, the very simplicity of these models which makes them attractive, for the following reasons:

- as mentioned above, the tools of statistical physics can be used to analyze and predict the behaviour of large assemblies of formal neurons; one may in particular investigate the information storage properties and the dynamics of learning and retrieval of such networks acting as associative memories (a minimal illustration is sketched below);
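To make this last point concrete, the following short Python sketch (not part of the original text; all function and variable names are illustrative) implements the textbook Hopfield-type associative memory assumed here: McCulloch-Pitts threshold units with states +1 or -1, Hebbian (outer-product) storage of a few patterns, and asynchronous threshold updates for retrieval.

    import numpy as np

    def store(patterns):
        # Hebbian (outer-product) rule: W_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with W_ii = 0.
        patterns = np.asarray(patterns, dtype=float)    # shape (p, N), entries +1 or -1
        n_units = patterns.shape[1]
        w = patterns.T @ patterns / n_units
        np.fill_diagonal(w, 0.0)                        # no self-coupling
        return w

    def retrieve(w, state, sweeps=10, seed=None):
        # Asynchronous McCulloch-Pitts updates: s_i <- sign(sum_j W_ij s_j).
        rng = np.random.default_rng(seed)
        s = np.array(state, dtype=float)
        for _ in range(sweeps):
            for i in rng.permutation(len(s)):           # visit the units in random order
                local_field = w[i] @ s
                s[i] = 1.0 if local_field >= 0 else -1.0
        return s

    # Usage: store two random patterns, then recall one of them from a corrupted probe.
    rng = np.random.default_rng(0)
    xi = rng.choice([-1.0, 1.0], size=(2, 100))         # p = 2 patterns, N = 100 units
    w = store(xi)
    probe = xi[0].copy()
    flipped = rng.choice(100, size=15, replace=False)   # flip 15 of the 100 bits
    probe[flipped] *= -1.0
    recalled = retrieve(w, probe, seed=1)
    print("overlap with stored pattern:", recalled @ xi[0] / 100)

With only two patterns stored in a hundred-unit network, the loading is far below the model's storage capacity, so the corrupted probe typically relaxes back onto the stored pattern; this is the associative-memory behaviour whose storage properties and retrieval dynamics are analysed in the statistical-physics studies cited above.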