Attention is what makes Sensory Data Stand Out

The title above should come as no surprise to most people. However, we now have empirical scientific evidence to back it up. When you concentrate your attention on a particular channel of sensory input, that channel comes in clear and sharp, regardless of the background information. The study in question focuses on the sense of sight, but what applies there applies to the other senses as well.

"We live with the illusion that our visual system processes all the information that is available in the visual scene in a single glimpse," says John H. Reynolds, Ph.D., an associate professor in the Systems Neurobiology Laboratory at the Salk Institute for Biological Studies. "In reality, there is far too much detail in a typical scene for the visual system to take it in all at once. So our perception of the world around us is in a sense pieced together from what we pay attention to."

This is of course important for any virtual reality. If the user is not paying attention to all the details at once, then those details may not be necessary. If we can predict with any degree of accuracy which ones are being focussed upon at any given moment, we know when and where to ramp the fidelity up, without having to increase it all at once.
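
As a purely illustrative sketch, the idea might look something like the following, assuming the engine already produces a per-object estimate of how likely the user is to be attending to it. Everything here (the SceneObject class, the attention_prob field, the numbers) is invented for the example rather than taken from the study.

    # Hypothetical attention-driven level-of-detail pass.
    from dataclasses import dataclass

    @dataclass
    class SceneObject:
        name: str
        attention_prob: float   # 0.0 = almost certainly ignored, 1.0 = centre of attention
        base_lod: int           # highest detail level the asset supports

    def assign_lod(objects, min_lod=1, max_lod=4):
        """Ramp fidelity up only where attention is predicted to land."""
        for obj in objects:
            lod = min_lod + round((max_lod - min_lod) * obj.attention_prob)
            yield obj.name, min(lod, obj.base_lod)

    scene = [SceneObject("avatar_face", 0.9, 4), SceneObject("distant_tree", 0.1, 3)]
    print(dict(assign_lod(scene)))   # {'avatar_face': 4, 'distant_tree': 1}

The point is only that a coarse probability is enough to decide where extra polygons and texture resolution are actually worth spending.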

Researchers had known for some time that paying attention to visual details increases the firing rate of the neurons tuned for the attended stimulus. Until now, it was assumed that these attention-dependent increases in neural activity were the primary cause of the improvement in acuity experienced when a person focuses on a stimulus. That turns out not to be the case.

"What we found is that attention also reduces background activity," says postdoctoral researcher and first author of the Salk study, Jude Mitchell, Ph.D. "We estimate that this noise reduction increases the fidelity of the neural signal by a factor that is as much as four times as large as the improvement caused by attention-dependent increases in firing rate. This reduction in noise may account for as much as 80% of the attention story."

When light hits the retina, visual information is translated into a cascade of nerve impulses sending signals deep into the brain. It is here, in the brain's visual cortex, which resides in the occipital lobe at the back of the skull, that these signals are interpreted and give rise to perception. But the visual system has limited capacity and cannot process everything that falls onto the retina. Instead, the brain relies on attention to bring details of interest into focus so it can select them out from background clutter.

In other words, when the user focuses on one input stream, their awareness of the others drops even lower. So, not only is there potential for relaxing the need to provide a continuously high fidelity to objects not being focussed on at any given time, but there is also a chance that the quality of the rest of the environment could be allowed to drop a bit, without affecting user experience.
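
A minimal sketch of that second idea, assuming a single scalar signal between 0 and 1 describing how firmly the user is locked onto one stream; both that signal and the 0.5 quality floor are assumptions made for the example, not values from the research.

    def peripheral_scale(focus_lock: float, floor: float = 0.5, ceiling: float = 1.0) -> float:
        """Quality multiplier for everything outside the attended region: the
        firmer the lock, the further the periphery drops, down to a safety floor."""
        focus_lock = max(0.0, min(1.0, focus_lock))
        return ceiling - (ceiling - floor) * focus_lock

    print(peripheral_scale(0.0))   # 1.0  -- no lock, render the whole scene fully
    print(peripheral_scale(0.9))   # 0.55 -- strong lock, periphery at roughly half quality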

But even under the most controlled laboratory conditions, the responses evoked by identically repeated stimuli vary from trial to trial. "Neurons are very noisy computing devices," says Mitchell. "Each neuron receives input from thousands of neurons and needs to distinguish the incoming information from the background noise."

Yet, when the researchers measured the activity of a large population of visual neurons in animals trained to play a simple video game that required rapt attention to a visual stimulus on the screen, they noticed the internal 'noise' of the visual neurons quietened down, with each neuron only receiving input from dozens of others. In other words, once the shift in attention had occurred, extra visual information was filtered out before it even reached the brain.

The area is seeing increased study for the prevention of attention disorders, and the same work is obviously of increasing relevance to VR. Knowing that this filtering happens, and how it occurs, is already valuable: savvy designers can use it to steer user attention from task to task, and know where the user's attention is at all times with reasonable certainty.

Of course, being able to track these attention shifts would be a much better way of dealing with the phenomenon, especially if it could be done in real time. Sadly, that is still some way off, but it does open up many possibilities for less smoke and fewer mirrors while retaining the same grade of sensory illusion.
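
Purely as speculation about what such a real-time loop might eventually look like; the gaze source below is entirely hypothetical, standing in for an eye tracker or attention model that did not yet exist for this purpose when the study was published.

    import time

    def get_gaze_target():
        # Hypothetical stand-in for a real-time eye tracker or attention model.
        return "avatar_face", 0.8                # (attended object, confidence of the lock)

    def render_frame(attended, periphery_quality):
        # Placeholder for the engine's actual draw call.
        print(f"full detail on {attended}, periphery at {periphery_quality:.2f}")

    for _ in range(3):                           # stand-in for the main render loop
        target, lock = get_gaze_target()
        render_frame(target, 1.0 - 0.5 * lock)   # same falloff idea as sketched earlier
        time.sleep(1 / 60)                       # ~60 Hz frame pacing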

The findings of the Salk researchers were published in the September 24, 2009 issue of the journal Neuron.

References

Attention makes sensory signals stand out amidst the background noise in the brain (Salk Institute press release)

Neuron (journal)

Original paper: Spatial Attention Decorrelates Intrinsic Activity Fluctuations in Macaque Area V4 (Neuron, September 24, 2009)
