Swift MoCap Sequence Retrieval
In an ideal world, all MoCap in a virtual environment would be captured in real time from a person's physical body, or, failing that, from the mind's intent in cases where the body is not up to the task. Consider, for example, a person whose legs don't work, dancing: the physical body cannot perform the movement, so take the mind's intent instead.
That day will come, and fairly soon. Until then, however, most motion capture sequences are premade, sitting in huge databanks, sometimes tens of thousands of sequences in size, waiting to be called forth and used on an avatar.
The problem is that even with the most organized filing system, finding and calling forth the right one of 10,000 pre-scripted movement files at any given moment is hardly easy, nor is the result particularly realistic when there may be 20-second gaps between finding one file and finding the next.
Now, a research effort has yielded a new way to summon premade MoCap, one that could dramatically cut retrieval times and help ensure the right file is chosen.
Led by principal investigator Dr Sally Jane Norman, Director of Culture Lab, Newcastle University, researchers have developed a prototype data retrieval tool based on a quick sketch of the required movement. Using a mouse or stylus, the user draws quick strokes on a visual copy of the avatar, indicating which limbs to move and how. The system then analyzes the direction of the sketched limb movements and searches its library for the MoCap sequence file that most closely matches them. To change to the next sequence, a few more flicks with the stylus, and the system is off again.
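The article doesn't spell out how the matching works internally, but the core idea of "search for the sequence that most closely matches the sketched limb directions" can be sketched in a few lines. The following is a minimal illustration, not the researchers' actual algorithm: it assumes each stroke is summarized as a rough 2D direction per limb, that every stored clip has been preprocessed into the same per-limb summary, and that clips are scored by cosine similarity. All names (`match_sketch`, the limb keys, the clip library) are hypothetical.

```python
import math

def cosine(a, b):
    """Cosine similarity between two 2D direction vectors."""
    dot = a[0] * b[0] + a[1] * b[1]
    na, nb = math.hypot(*a), math.hypot(*b)
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)

def match_sketch(sketch, library):
    """Return the name of the clip whose limb directions best match the sketch.

    Score = mean cosine similarity over the limbs the user actually drew;
    limbs absent from a clip's summary contribute zero.
    """
    best_name, best_score = None, float("-inf")
    for name, summary in library.items():
        scores = [cosine(direction, summary.get(limb, (0.0, 0.0)))
                  for limb, direction in sketch.items()]
        score = sum(scores) / len(scores)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# A toy library of clips, each summarized by per-limb motion directions.
library = {
    "wave": {"right_arm": (0.1, 1.0)},
    "walk": {"left_leg": (1.0, 0.0), "right_leg": (1.0, 0.0)},
    "jump": {"left_leg": (0.0, 1.0), "right_leg": (0.0, 1.0)},
}

# A quick upward stroke on the right arm retrieves the wave clip.
print(match_sketch({"right_arm": (0.0, 1.0)}, library))  # wave
```

A real system would compare full motion trajectories over time rather than single direction vectors, but the retrieve-by-nearest-match loop is the same shape.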
A secondary use for such a system would be to drive MoCap interactively: rather than performing the motion capture in real time, the user employs the system to indicate desired movements. Waggle your hips for a sexy walk, for example, and the system fetches a premade sexy-walk sequence and begins looping it, mapped onto your avatar.
Details of the research are being published online in the Royal Society journal Philosophical Transactions of the Royal Society A, Monday, June 1st, 2009.