Introduction
Scientists have researched the integration of visual and auditory spatial information for many decades. Through this research, the scientific community has learned how the brain benefits from combining visual and auditory information. Both modalities contribute to spatial localization because audition and vision each provide distinctive and complementary information to the brain. Although vision, through direct projection, provides reliable and accurate information for spatial localization, audition is also very important because of the broad range of information it can provide about the location of a signal in any direction.
Audition provides invaluable information when a visual stimulus is unavailable, hidden, or camouflaged. Spatial localization based on the combination of both modalities is more reliable than localization using either modality in isolation. The following paper discusses research that verified visual dominance in spatial localization, along with evidence supporting the importance of audition for spatial localization.
Background
Visual Dominance
Anyone who has watched a ventriloquist act has observed the effects of visual dominance in spatial localization. This phenomenon occurs when a person perceives an auditory stimulus as originating near a visual stimulus, even when the two stimuli conflict (Knudsen & Brainard, 1995). Visual localization can even influence the localization of an auditory stimulus when visual and auditory stimuli are es...
... middle of paper ...
Meredith, M. A., & Stein, B. E. (1986). Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. Journal of Neurophysiology, 56(3), 640-662.
Rauschecker, J., & Harris, L. (1983). Auditory compensation of the effects of visual deprivation in the cat's superior colliculus. Experimental Brain Research, 50(1), 69-83.
Rauschecker, J. P. (1993). Auditory compensation for early blindness in cat cerebral cortex. The Journal of Neuroscience, 13(10), 4538.
Thurlow, W. R. (1976). Further study of existence regions for the "ventriloquism effect". Journal of the American Audiology Society, 1(6), 280.
Wallace, M. T., Meredith, M. A., & Stein, B. E. (1993). Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus. Journal of Neurophysiology, 69(6), 1797-1809.
Sullivan, G. D., Georgeson, M. A., & Oatley, K. (1972). Channels for spatial frequency selection and detection of single bars by the human visual system. Vision Research, 12, 383-394.
As Table 1 shows, the mean reaction time to the visual stimulus is greater than the mean reaction time to the auditory stimulus. The chi-squared value of 9.600 in Table 2 allows us to reject the null hypothesis that there is no difference between auditory and visual reaction times. This result is consistent with our predicted outcome, and it also supports Brebner and Welford (1980). Reaction time to a stimulus depends on many factors, including reception of the stimulus by the sense organ, transmission of a neural signal to the brain, muscular activation, and finally, the physical reaction to the stimulus (Pain & Hibbs, 2007). Reaction times to the auditory stimulus were shorter than to the visual stimulus, implying that the auditory signal reaches the motor cortex more quickly.
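The chi-squared comparison above can be sketched numerically. The counts below are invented for illustration (the raw data behind Tables 1 and 2 are not reproduced here); the test compares the observed split of which modality was faster against the even split expected under the null hypothesis of no difference.

```python
# Hypothetical sketch of a chi-squared test like the one reported in Table 2.
# The participant counts are invented for demonstration only.

def chi_squared(observed, expected):
    """Pearson's chi-squared statistic: the sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Suppose 17 of 20 participants reacted faster to the auditory stimulus
# and 3 reacted faster to the visual stimulus (hypothetical counts).
observed = [17, 3]
expected = [10, 10]  # an even split under the null hypothesis

stat = chi_squared(observed, expected)
print(stat)  # ≈ 9.8, well above the df = 1 critical value of 3.841
```

With one degree of freedom, any statistic above the 3.841 critical value is significant at p < .05, which is how a value such as 9.600 justifies rejecting the null hypothesis.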
Rowland, L. P. (Ed.). (1992, April). Neurology, (Suppl. 4), 5-7.
Kanske, P., Heissler, J., Schönfelder, S., Forneck, J., & Wessa, M. (2013). Neural correlates of
The environment provides the information the equilibrium center needs to determine how to position the body. Information is received in three main places: the eyes provide visual information, the ears provide vestibular and auditory information, and the joints provide proprioceptive information. In general, the eyes help position the body according to different horizontal angles relative to the ground. The ears allow the body to register movement, such as acceleration or deceleration, through the vestibular organs (1). Movement is also processed in parts of the brain, as well as in the ears.
National Institute on Deafness and Other Communication Disorders. (November 2002). Retrieved October 17, 2004, from http://www.nidcd.nih.gov/health/hearing/coch.asp
Parsons, L. M., Bower, J., Xiong, J., Li, J., & Fox, P. (1996). Cerebellum implicated in sensory acquisition and discrimination rather than motor control. Science, 272, 545-547.
Merzenich, M. M., et al. (1983). Topographical reorganization of somatosensory cortical areas 3b and 1 in adult monkeys following restrictive deafferentation. Neuroscience, 33-55.
Sowell, E. R., Thompson, P. M., & Toga, A. W. (2004). Mapping changes in the human cortex
A physiological correlate of the "zoom lens" of visual attention. The Journal of Neuroscience, 23(9), 3561-3565.
Shinn-Cunningham, B. G. (2008). Object-based auditory and visual attention.
...I) to show activation in the dorsal cortex during unconscious perception. Therefore, if neuroimaging evidence demonstrates that the dorsal stream is activated during unconscious processing, it can strengthen the conclusions drawn from their experiment.
Surround inhibition arises from the centre-surround organization of receptive fields, either in the retina of the eye or elsewhere in the nervous system. It was first described by Kuffler in 1953 and has been studied widely in the retinal ganglion cells of vertebrates ranging from fish to mammals (including monkeys) to birds and amphibians (Patrick, 2002). Surround inhibition works as a neural mechanism that sharpens sensation and focuses neural activity in the central nervous system (Patrick, 2002). It is well known in the somatosensory system, where central signals are made easy ...
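As a toy illustration of how surround inhibition sharpens a signal, the sketch below subtracts a fraction of each point's neighbours from its own response. This is a minimal lateral-inhibition model with an invented inhibition weight, not a physiological simulation of centre-surround receptive fields.

```python
# Toy lateral-inhibition model: each unit's output is its own input minus
# a fraction of its neighbours' inputs. The 0.4 weight is arbitrary,
# chosen only to make the edge enhancement visible.

def surround_inhibit(signal, inhibition=0.4):
    """Subtract a weighted average of each point's two neighbours."""
    out = []
    for i, x in enumerate(signal):
        left = signal[i - 1] if i > 0 else x
        right = signal[i + 1] if i < len(signal) - 1 else x
        out.append(x - inhibition * (left + right) / 2)
    return out

# A step edge: inhibition exaggerates the contrast at the boundary.
edge = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]
print(surround_inhibit(edge))
```

The output undershoots just before the step and overshoots just after it, the same exaggeration of edges that produces Mach bands in vision.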
Sounds automatically produce conscious visual and auditory experiences in auditory-visual synesthesia. Direct auditory-visual percepts may play a functional role in multisensory processing, which may give rise to a synesthesia-like illusion, the illusory flash. The illusion occurs predominantly in peripheral vision and is accompanied by electrical activity over occipital sites (Oz, O1, and O2) (Shams et al., 2001). The cross-modal transfer hypothesis assumes that connections between auditory and visual regions are indirect, mediated by multisensory audiovisual brain regions (Goller et al., 2009). Multisensory processes may be activated when two senses are stimulated or, as in synesthesia, by a unimodal stimulus (Goller et al., 2009). This
Most attention studies look at a one-on-one situation in which one sound is presented with one visual image [1]. However, everyday environments teem with an overwhelming amount of visual, auditory, gustatory, olfactory, and somatosensory input. So while a one-to-one relationship between a sound and a visual stimulus may provide a sound starting point for investigations into visual attention, it may not yield the most meaningful results for real-world application. A much more realistic approach, with regard to the experiences we actually face, would explore how other sensory systems interact with the visual system, and eventually how the integration of many sensory systems plays a role in visual attention.