Gesture control is a type of interface to the computer system controlling a virtual environment, one that uses hand motions and changes in body posture, such as stance, to impart command sequences directly, in lieu of a keyboard or mouse. Typically it involves behaviour similar to that of a mime: gesture-tracking systems read the positions of various body parts, analyse how those positions change over time, and then implement their interpretation of the desired action.
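The pipeline described above - sample body-part positions, analyse how they change over time, then map the motion to a command - can be sketched minimally in code. Everything here is an illustrative assumption (the function name, the normalised coordinates, the threshold), not the API of any particular gesture-tracking product:

```python
def classify_swipe(positions, threshold=0.3):
    """Classify a time-ordered sequence of normalised hand x-positions
    (0.0 = left edge of frame, 1.0 = right edge) as a swipe gesture.

    Returns "swipe_right", "swipe_left", or "none". The threshold is an
    arbitrary illustrative value; a real system would also consider
    velocity, smoothness, and the other two spatial axes.
    """
    if len(positions) < 2:
        return "none"
    # Net displacement over the whole sample window stands in for the
    # "how positions change over time" analysis step.
    displacement = positions[-1] - positions[0]
    if displacement > threshold:
        return "swipe_right"
    if displacement < -threshold:
        return "swipe_left"
    return "none"


# Example: the hand moves steadily left-to-right across the frame.
samples = [0.1, 0.25, 0.4, 0.6, 0.8]
print(classify_swipe(samples))  # swipe_right
```

A real recogniser would run continuously over a sliding window of sensor frames and handle many gesture classes, but the structure - sample, analyse change, emit command - is the same.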
Below, we offer a selection of links from our resource databases which may match this term.
Related Dictionary Entries for Gesture Control:
Resources in our database matching the Term Gesture Control:
A potential concern has been identified in the psychological implications of gesture control interfaces. When we design such systems, we must be aware of the power some gestures have to alter the perceptions and memories of those who witness them.
MoCap - Motion Capture - for all its impressive abilities, has definite limitations: restricted sensory fidelity, and the expense and bulk of the rig. Gesture control is cheap and captures every little movement, but is easily overwhelmed. Is a hybrid system possible?
AirStrike is a gesture control system designed to enable control of any PC or display interface by waving your fingers in the air, using smart sensing technology to detect those gestures accurately enough to completely replace a computer mouse or 3D pointer.
It often seems that scarcely a week goes by without word of a minor or major breakthrough in gesture control of computer systems, speech recognition, or speech synthesis systems. All seem to be vectoring in on the ability to control computers entirely hands-off.
The stuff of numerous sci-fi films – the concept of moving data wholesale from one computer to the next with no more than a gesture – has been made real, with a device not only functional in the lab, but already on its way to mass-market commercial use, integrated into satellite and cable TV units.
A PDF released by Microsoft Research, entitled "Demonstrating the Feasibility of Using Forearm Electromyography for Muscle-Computer Interfaces", details the science behind utilising muscle control as a viable alternative input system to gesture recognition or spoken commands.
PDF size: 1 MB
A larynx or voicebox is a complex piece of kit to control. Creating an interface for a fully functional artificial one may be beyond us - unless we jack it straight into the brain, and let the nervous system control the new one just as it did the old.
Here we have a remote control, used to control what was originally a human, now just a glorified appliance. If the idea of a remote being used to control a human seems preposterous, remember GVS.
Gaze tracking comes to the desktop, as an interface modality designed exclusively to control virtual environments by gaze alone, and aimed at those who lack the fine motor control needed for a mouse. Packaged in the unfortunately named "Snap Clutch" driver.
Rice University's Multi-Robot Systems Laboratory has created an online game designed solely to help further research into swarm control of simple robotic systems - namely, the quest to find truly effective control strategies for their own bespoke r-one swarm robotic system.
Industry News containing the Term Gesture Control:
A system that can recognize human gestures could provide a new way for people with physical disabilities to interact with computers. A related system for the able-bodied could also be used to make virtual worlds more realistic. The system i...
Mgestyk Technologies has developed a gesture-based control system using nothing more than an "affordable 3D camera" and some custom software to capture even small hand gestures.