
Eye-tracking as Virtual Environment Interaction for Disabilities

Eye tracking has grown up, it seems, moving out of the larger VR environments such as CAVEs and hemisphere display systems and arriving instead at the desktop home installation. There, it serves much the same function: navigation and interaction via gaze control.

In essence, it offers the exact same functionality as a mouse, but driven by eye movements. So, who would desire such an interface? Well, anyone who cannot properly use a mouse, for starters. If you have ever watched an individual with limited muscle control in their hands jerk the mouse about, or make continual mistakes such as walking into walls and overshooting with keyboard movement, then you have seen the potential audience for such an interface.

The interface was created by COGAIN (COmmunication by GAze INteraction), an EU-funded project with the goal of holistic development of interfaces that allow those with a range of disabilities to interact with VR environments on an equal footing. To do so, they are creating interfaces which mimic, as closely as possible, other modalities of interaction.

The software in this case is called gaming-with-gaze, and it has been successfully tested in both Second Life and World of Warcraft. Created at De Montfort University in the UK, the initial drivers are being given away free in order to increase uptake and allow developers to experiment. No word as yet on whether future versions will be charged for.

For people suffering from conditions such as cerebral palsy, motor neurone disease (MND) or so-called locked-in syndromes, being able to move around and interact in a virtual environment is a "truly liberating experience," stated Howell Istance, one of the programmers who helped develop the software.

"Until now, gaze-tracking technology has mainly been used for typing with visual keyboards, for browsing the web and other text-based applications. We have taken it to an entirely new level by using eye movements to control an avatar in a virtual environment, allowing people with disabilities to appear and interact just like able-bodied people if they wish."

The last part rather states the obvious. It is a given that anyone with a significant disability longs for it to vanish, and to be on a level playing field with everyone else. That is one of the staple points of any VR.

Applying the Interface to a given VR

More work than might be immediately apparent is necessary to connect this gaze recognition system to any given VR system at present. The software needs to know where the different elements of the interface sit on screen, since it does not use a mouse pointer. It must therefore be able to tell when a user is looking at one of the interface panels, and not at the VR environment itself.

Thus, for both Second Life and World of Warcraft, a heat map was created by determining, from a range of users, where on screen they looked in order to access certain controls. This clunky system, whilst it does work, is at the mercy of any interface redesign in the VR. It does, however, allow the mouse to be used in tandem with the gaze recognition approach, which opens up the possibility of additional interaction levels. An option to supplant the mouse entirely would definitely be an improvement, but as the software currently stands, the mouse is retained for safety, to allow recovery if the gaze recognition overlay stops working - as it is prone to do.
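The region test described above can be sketched roughly as follows. This is a hypothetical illustration, not the actual Snap Clutch code: the region names and coordinates are invented, and stand in for whatever the heat-map study identified on a real game screen.

```python
# Hypothetical sketch: decide whether a gaze sample lands on an interface
# panel or on the 3D world behind it. Region names and pixel coordinates
# are invented for illustration; a real driver would load them from the
# heat-map data gathered from users.

from dataclasses import dataclass

@dataclass
class Region:
    name: str
    x: int       # top-left corner, in screen pixels
    y: int
    width: int
    height: int

    def contains(self, gx: int, gy: int) -> bool:
        # Simple axis-aligned bounding-box test on the gaze point.
        return (self.x <= gx < self.x + self.width
                and self.y <= gy < self.y + self.height)

# Example regions, as a heat-map study might (hypothetically) define them.
UI_REGIONS = [
    Region("chat_panel", 0, 600, 300, 168),
    Region("minimap", 874, 0, 150, 150),
]

def classify_gaze(gx: int, gy: int) -> str:
    """Return the UI region under the gaze point, or 'world' if none match."""
    for region in UI_REGIONS:
        if region.contains(gx, gy):
            return region.name
    return "world"
```

The fragility the article notes follows directly from this design: if the game's interface is redesigned, the stored region coordinates no longer match the screen, and every gaze sample is misclassified until the map is rebuilt.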

Additionally, because gaze is used to determine movement, the system is currently completely incompatible with head-mounted display systems and, by extension, head tracking systems. One or the other, not both.

A further complication: sometimes the user may wish to move even though they are currently looking at a menu. In this case, the system uses 'gaze gestures' - patterns of rapid eye movement which it interprets as pre-set commands, much like macro triggers. These can be set by the user if so desired.

"In the current set up, we have programmed 12 gesture sequences to activate different keyboard or mouse events," Istance stated. "Many more commands are possible but the total number is limited by the users' memory and the need to differentiate between when someone wants to input a command and when they are just looking at the screen."

Snap Clutch software controlling World of Warcraft


On the hardware side, there are no surprises. Commercial, off-the-shelf webcams are used in pairs. These create a stereoscopic view of the user's face and allow each camera to track one eye. The software attempts to smooth out interaction, but there are some teething problems. Spectacles tend to confuse the results somewhat, and don't even think about shades.


The gaming-with-gaze software should make the avatars of people with disabilities almost indistinguishable in their behaviour and abilities from those of able-bodied people in online games and environments.

"It could be life changing for the large number of paralysed people whose only means of communicating is with their eyes. Second Life, for example, could really be a second life for them, providing not only entertainment but versatile electronic services, for example, education," says Aulikki Hyrskykari, a researcher at Tampere University.

"Obviously there will be limitations to what users with disabilities can do - such as not being able to perform several actions simultaneously - but they can choose what activities they participate in. That is a choice they did not have before," Istance adds.


COGAIN: Communication by Gaze Interaction

Gazetalk Info and Download

Snap Clutch Info and Download (Main Driver)


