
Multiple Sensory Modalities Proven to Reinforce Each Other

Usually when we think of the sensory input channels - sight, sound, touch, taste, smell, balance - we think of them in a traditional hierarchical order, each reporting separately up to the brain.

It seems that that hierarchy is not as cut and dried as we once thought, with interesting implications for virtual worlds. The senses interlink below the level of the brain's integrative structures, and as a result they can amplify one another.

A study published in the online open access journal BMC Neuroscience shows that, in monkeys, the region involved in hearing can directly improve perception in the visual region, without the involvement of other structures to integrate the senses. In other words, hearing a familiar sound that is associated, either instinctively or through learning, with a given visual stimulus automatically amplifies its visual counterpart. The visual sense seeks out that counterpart element, expecting it to be there, and locks onto it, even over stronger visual signals that may also be present.

Hear something that indicates a sight should be nearby, and you will see it, in other words. The same also works in reverse.

"Auditory or visual-auditory responses in the primary visual cortex are highly probable given the presence of direct projections from the primary auditory cortex", explain P. Barone and colleagues from the Centre for Brain and Cognition Research, Toulouse, France. "We looked for modulation of the neuronal visual responses in the primary visual cortex by auditory stimuli in an awake monkey."

The researchers recorded the neuronal responses with microelectrodes inserted directly into the primary visual cortex of a rhesus macaque. The monkey was then required to orient its gaze towards a visual stimulus. The time taken for the neurons in the visual cortex to respond to the stimulus, or latency, was recorded. Barone and colleagues then measured the latency when the visual stimulus was accompanied by a sound emanating from the same spot. When the visual signal was strong - i.e., high contrast - the auditory stimulus did not affect latency; however, if the visual signal was weaker - i.e., low contrast - latency decreased by 5-10%, suggesting that in some way the auditory stimulus speeds up the response to the visual stimulus.
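The reported effect can be illustrated with a toy model. The function and all numbers below are assumptions for illustration only: a hypothetical baseline latency that rises as contrast falls, shortened by roughly 8% (the middle of the reported 5-10% range) when a co-located sound accompanies a weak visual signal; the "weak" threshold is likewise invented.

```python
def visual_latency_ms(contrast: float, with_sound: bool) -> float:
    """Toy sketch of the study's finding: a co-located sound shortens
    visual response latency only when the visual signal is weak.
    Baseline numbers and the 0.5 'weak' threshold are hypothetical."""
    base = 60.0 + 40.0 * (1.0 - contrast)   # assumed baseline latency, ms
    if with_sound and contrast < 0.5:       # low-contrast (weak) stimulus
        return base * 0.92                  # ~8% reduction, per the 5-10% range
    return base

# High contrast: sound makes no difference; low contrast: latency drops.
for contrast in (0.9, 0.2):
    silent = visual_latency_ms(contrast, with_sound=False)
    cued = visual_latency_ms(contrast, with_sound=True)
    print(f"contrast={contrast}: silent={silent:.1f} ms, with sound={cued:.1f} ms")
```

The point of the sketch is the shape of the interaction, not the numbers: the auditory channel only helps when the visual channel's own signal is too weak to respond quickly on its own.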

So, from a VR point of view, there is even more reason than before to ensure that actions in your environment are transmitted over multiple sensory channels. Because one sense locks onto the signal detected by another, multi-channel cues can be used to redirect a user's attention, particularly when it drifts towards the edge of the simulation's immersive field. This cross-modal reinforcement is also part of the explanation for the phenomenon of rubber body part illusions.
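One way to act on this advice is to broadcast every world event to all registered sensory channels, so that cues always co-occur in time and space. The sketch below is a minimal illustration of that pattern; all class and channel names are invented for the example, not drawn from any particular VR engine.

```python
class MultiModalEvent:
    """A hypothetical world event carrying a name and a 3D position."""
    def __init__(self, name, position):
        self.name = name
        self.position = position

class SensoryRouter:
    """Broadcasts each event to every registered sensory channel,
    so a weak cue on one channel is reinforced by the others."""
    def __init__(self):
        self.channels = {}          # channel name -> handler callable

    def register(self, channel, handler):
        self.channels[channel] = handler

    def dispatch(self, event):
        # The same event goes to every channel, keeping the cues
        # synchronised - the condition under which the senses amplify
        # one another.
        for handler in self.channels.values():
            handler(event)

router = SensoryRouter()
log = []
router.register("visual", lambda e: log.append(("visual", e.name)))
router.register("audio",  lambda e: log.append(("audio", e.name)))
router.dispatch(MultiModalEvent("door_opens", (1.0, 0.0, 2.0)))
```

In a real engine the handlers would trigger rendering and spatialised audio; the design point is simply that events are authored once and emitted on every channel, rather than per-sense.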

