
Control of Your Hands is in Your Eyes

An interesting study from Northwestern University has implications for the interactivity and control systems of both augmented reality and full-embodiment virtuality. It seems that many of the tasks we perform throughout the day are not merely on autopilot, as was previously understood; the brain actively hides these autopilot functions from conscious control.

"Our study gives a salient example," said Yangqing 'Lucie' Xu, lead author of the study and a doctoral candidate in psychology at Northwestern. "When you pick up an object, your brain automatically decides how to control your muscles based on what your eyes provide about the object's shape. When you pick up a mug by the handle with your right hand, you need to add a clockwise twist to your grip to compensate for the extra weight that you see on the left side of the mug.

"We showed that the use of this visual information is so powerful and automatic that we cannot turn it off. When people see an object weighted in one direction, they actually can't help but 'feel' the weight in that direction, even when they know that we're tricking them," Xu said.

The implications for haptics are obvious. This is a potential visual method of complementing a haptic interface, so that the user experiences weight beyond what the interface can physically recreate, based solely on their visual associations and the brain's autopilot functionality taking over.

The researchers conducted two experiments. In the first, people were asked to grasp a vertical stick with a weight hanging from either its left or its right side. The volunteers were perfectly capable of determining which way the weight pulled, whether they looked at the stick or kept their eyes closed and worked by feel alone.

The second experiment was more devious. The same stick and weights were used, but the researchers added a set of mirrors between the volunteers' heads and the hands holding the stick. They could no longer see the stick directly, only through the mirrors. Sometimes the mirrors showed a reversed image, and sometimes a true one, and the volunteers had no way of knowing which was which.

The results were radically different. With their eyes closed, the volunteers all reported correctly which side the weight was on, since they were operating by touch alone. Run the experiment with their eyes open, however, and the results changed dramatically. Every time the mirrors showed a flipped image, the visual input overrode the sense of touch, and the volunteers all reported the weight as being on the opposite side to where it really was.

Even when the nature of the trick was explained to the volunteers in the second half of the second experiment, and they understood that the mirrors might not be telling the truth, their brains were still fooled. Because they could not know when the image would be reversed and when it would not, visual information consistently overrode haptic information.

Their sense of touch told them how heavy the weight was, but their eyes told them where the pull was coming from.

"People still could not ignore the visual information," said Xu. "In fact, the effect even works on us, and we designed the experiment!"

Steven Franconeri, co-author of the study and associate professor of cognitive psychology at Northwestern, said the brain is constantly making decisions for us that we don't know about or understand.

"These decisions are usually smart and based on vast experience," he said. "In this study's example, your brain is automatically using visual information to tell your hands what they are feeling. We can show that these decisions are happening by manipulating the information your brain receives -- we mirror-reverse the visual information and your brain now tells your hands that they are feeling the reverse of what they are actually feeling. This inference is mandatory -- you feel it even if you know it's not true."

Franconeri said this is not a "bug" in the brain's operation.

"In the vast majority of cases, you want to 'delegate' decisions like this to the unconscious parts of your brain, leaving you free to focus on less straightforward problems, like following driving directions or enjoying your cup of coffee."


When your eyes tell your hands what to think

Visual influence on haptic torque perception (paper)
