Processing of Visual and Auditory Space and Its Modification by Experience

Visual spatial information is projected from the retina to the brain in a highly topographic fashion, so that 2-D visual space is represented in a simple retinotopic map. Auditory spatial information, by contrast, must be computed from binaural time and intensity differences as well as from monaural spectral cues produced by the head and ears. Evaluation of these cues in the central nervous system gives rise to neurons that are sensitive to the location of a sound source ("spatial tuning") and, in some animal species, to auditory space maps in which location is encoded in a 2-D map, much as in the visual system. The brain structures thought to be involved in the multimodal integration of visual and auditory spatial information are the superior colliculus in the midbrain and the inferior parietal lobe in the cerebral cortex.
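To make the binaural computation mentioned above concrete, the following is a minimal sketch (not taken from the paper) of how the interaural time difference (ITD), one of the binaural cues, relates a sound source's azimuth to the arrival-time difference at the two ears, and how that difference can be recovered by cross-correlation in the spirit of a coincidence-detection model. The head radius, speed of sound, and Woodworth approximation are illustrative assumptions, not values from the source.

```python
# Sketch: interaural time difference (ITD) as a binaural localization cue.
# All constants and the Woodworth approximation are assumptions for illustration.

import numpy as np

HEAD_RADIUS = 0.0875    # metres, assumed average head radius
SPEED_OF_SOUND = 343.0  # metres per second

def itd_from_azimuth(azimuth_rad):
    """Woodworth approximation: ITD = (r / c) * (theta + sin(theta))."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + np.sin(azimuth_rad))

def estimate_itd(left, right, fs):
    """Recover the ITD (seconds) between two ear signals via cross-correlation."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)  # lag (in samples) at the correlation peak
    return lag / fs

if __name__ == "__main__":
    fs = 48_000                           # sample rate in Hz
    source = np.random.randn(int(0.05 * fs))  # 50 ms of broadband noise

    azimuth = np.deg2rad(30)              # source 30 degrees to the right
    true_itd = itd_from_azimuth(azimuth)
    delay_samples = int(round(true_itd * fs))

    # The right ear (nearer the source) leads; the left ear gets a delayed copy.
    right = source
    left = np.roll(source, delay_samples)

    print(f"true ITD:      {true_itd * 1e6:.1f} microseconds")
    print(f"estimated ITD: {estimate_itd(left, right, fs) * 1e6:.1f} microseconds")
```

In a real system, intensity differences and monaural spectral cues would be combined with this timing cue; the sketch only illustrates the timing component.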
