Nonlinear neurons in the low-noise limit: a factorial code maximizes information transfer (Network 5)

We investigate the consequences of maximizing information transfer in a simple neural network (one input layer, one output layer), focusing on the case of nonlinear transfer functions. We assume that both receptive fields (synaptic efficacies) and transfer functions can be adapted to the environment. The main result is that, for bounded and invertible transfer functions, in the case of a vanishing additive output noise and no input noise, maximization of information (Linsker's infomax principle) leads to a factorial code, hence to the same solution as required by the redundancy-reduction principle of Barlow. We also show that this result holds for linear and, more generally, unbounded transfer functions, provided optimization is performed under an additive constraint, i.e. one that can be written as a sum of terms, each specific to one output neuron. Finally, we study the effect of a non-zero input noise. We find that, to first order in the input noise, assumed to be small in comparison with th...
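A minimal reconstruction of the low-noise argument, in our own notation (the symbols S, V and R below are ours, not necessarily the paper's): with additive output noise of vanishing variance, the conditional entropy of the output given the input reduces to the parameter-independent entropy of the noise, so maximizing the mutual information amounts to maximizing the output entropy, which splits into marginal entropies minus a redundancy term.

\[
I(\mathbf{V};\mathbf{S}) = H(\mathbf{V}) - H(\mathbf{V}\mid\mathbf{S}),
\]
\[
H(\mathbf{V}) = \sum_i H(V_i) - R(\mathbf{V}),
\qquad
R(\mathbf{V}) \equiv \sum_i H(V_i) - H(\mathbf{V}) \;\ge\; 0 .
\]

Here S is the input, V the output, and R the redundancy (multi-information) of the output. Each marginal entropy H(V_i) is maximized when the bounded, invertible transfer function of neuron i equalizes the distribution of its own output, and R vanishes exactly when the outputs are statistically independent; under the assumptions above (vanishing output noise, no input noise), both conditions can be met together, so the optimum of the infomax criterion is a factorial code, in agreement with Barlow's redundancy-reduction principle.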

[1] C. E. Shannon, A mathematical theory of communication, 1948, The Bell System Technical Journal.

[2] C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, 1950.

[3] F. Attneave, Some informational aspects of visual perception, 1954, Psychological Review.

[4] S. Laughlin, A Simple Coding Procedure Enhances a Neuron's Information Capacity, 1981, Zeitschrift für Naturforschung C: Biosciences.

[5] D. J. Field, Relations between the statistics of natural images and the response properties of cortical cells, 1987, Journal of the Optical Society of America A.

[6] R. E. Blahut, Principles and Practice of Information Theory, 1987.

[7] R. Linsker, Self-organization in a perceptual network, 1988, Computer.

[8] Zee et al., Understanding the efficiency of human perception, 1988, Physical Review Letters.

[9] H. B. Barlow et al., Finding Minimum Entropy Codes, 1989, Neural Computation.

[10] P. Földiák, Adaptation and decorrelation in the cortex, 1989.

[11] W. Bialek et al., Reading a Neural Code, 1991, NIPS.

[12] J. J. Atick et al., Towards a Theory of Early Visual Processing, 1990, Neural Computation.

[13] C. Jutten et al., Blind separation of sources, part I: An adaptive algorithm based on neuromimetic architecture, 1991, Signal Processing.

[14] J. J. Hopfield, Olfactory computation and object perception, 1991, Proceedings of the National Academy of Sciences of the United States of America.

[15] R. Linsker, Local Synaptic Learning Rules Suffice to Maximize Mutual Information in a Linear Network, 1992, Neural Computation.

[16] R. Linsker, Deriving Receptive Fields Using an Optimal Encoding Criterion, 1992, NIPS.

[17] H. G. Schuster, Learning by maximizing the information transfer through nonlinear noisy neurons and "noise breakdown", 1992.

[18] Z. Li et al., Understanding Retinal Color Coding from First Principles, 1992, Neural Computation.

[19] J. J. Atick et al., What Does the Retina Know about Natural Scenes?, 1992, Neural Computation.

[20] G. Burel et al., Blind separation of sources: A nonlinear neural algorithm, 1992, Neural Networks.

[21] J. C. Russ et al., The Image Processing Handbook, 2016, Microscopy and Microanalysis.

[22] N. Parga et al., Information processing by a perceptron in an unsupervised learning task, 1993.

[23] W. Bialek et al., Statistics of Natural Images: Scaling in the Woods, 1993, NIPS.

[24] J. J. Atick et al., Convergent Algorithm for Sensory Receptive Field Development, 1993, Neural Computation.

[25] A. N. Redlich, Redundancy Reduction as a Strategy for Unsupervised Learning, 1993, Neural Computation.

[26] N. Parga et al., Duality Between Learning Machines: A Bridge Between Supervised and Unsupervised Learning, 1994, Neural Computation.

[27] Z. Li et al., Toward a Theory of the Striate Cortex, 1994, Neural Computation.

[28] Z. Li et al., Efficient stereo coding in the multiscale representation, 1994.

[29] F. Chapeau-Blondeau, Information entropy maximization in the transmission by a neuron nonlinearity, 1994.

[30] D. L. Ruderman, Designing receptive fields for highest fidelity, 1994.

[31] R. Linsker, Sensory Processing and Information Theory, 1994.

[32] Z. Li et al., Towards a theory of striate cortex, 1994.

[33] J. Rospars et al., Coding of odour quality: roles of convergence and inhibition, 1994.