Learning the higher-order structure of a natural sound.

Unsupervised learning algorithms that attend only to second-order statistics ignore the phase structure (higher-order statistics) of signals, which contains all the informative temporal and spatial coincidences that we think of as 'features'. Here we discuss how an Independent Component Analysis (ICA) algorithm may be used to elucidate the higher-order structure of natural signals, yielding their independent basis functions. This is illustrated with the ICA transform of the sound of a fingernail tapping musically on a tooth. The resulting independent basis functions look like the sounds themselves, having similar temporal envelopes and the same musical pitches; thus they capture both the phase and frequency information inherent in the data.
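
As a rough illustration of the procedure described above, the sketch below applies an off-the-shelf ICA implementation (scikit-learn's FastICA, substituted here for the paper's infomax algorithm) to overlapping frames of a synthetic tapping-like signal that stands in for the actual recording. The signal synthesis, frame length, hop size, and component count are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: learning "independent basis functions" of a sound with ICA.
# Assumptions: FastICA stands in for infomax ICA, and a synthetic train of
# decaying pitched taps stands in for the recorded fingernail-on-tooth sound.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic stand-in for the recording: exponentially decaying sinusoidal
# "taps" at a few musical pitches, plus a small amount of background noise.
fs = 16000                          # sample rate (Hz)
duration = 5.0                      # seconds
t = np.arange(int(fs * duration)) / fs
signal = 0.01 * rng.standard_normal(t.size)
for onset in rng.uniform(0, duration - 0.1, size=60):
    pitch = rng.choice([440.0, 523.25, 659.25])     # A4, C5, E5
    idx = (t >= onset) & (t < onset + 0.05)
    tau = t[idx] - onset
    signal[idx] += np.exp(-tau / 0.01) * np.sin(2 * np.pi * pitch * tau)

# Cut the waveform into many short, overlapping frames; each frame is one
# data vector, so ICA sees an ensemble of local segments of the sound.
frame_len, hop = 128, 32
frames = np.stack([signal[i:i + frame_len]
                   for i in range(0, signal.size - frame_len, hop)])

# Fit ICA to the frame ensemble. The columns of the learned mixing matrix
# play the role of the independent basis functions.
ica = FastICA(n_components=32, whiten="unit-variance",
              max_iter=1000, random_state=0)
ica.fit(frames)
basis_functions = ica.mixing_.T     # shape: (n_components, frame_len)

print("learned basis functions:", basis_functions.shape)
```

If the input has the kind of structure the abstract describes, plotting the rows of `basis_functions` should show localized, pitched wavelets whose envelopes and pitches resemble the taps themselves, which is the sense in which the basis functions "look like the sounds".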
