
VR Interfaces: Mahru-M


Overview of Mahru-M
The Mahru-M service robot was developed by KIST - the Korea Institute of Science and Technology - in 2008. Designed as what was then the latest in a long line of service robots, Mahru bore more than a passing resemblance to the robot maid in the 1960s cartoon series 'The Jetsons'. The naming convention parallels the distinction between 'android' and 'gynoid': Mahru denotes a male robot, and Ahra a female one. KIST uses these names for every robot it creates, with the letters appended to the Mahru or Ahra prefix distinguishing different generations.

Of course, that means there were many models before Mahru-M; unfortunately, we lack detailed information on them. Both robot lines take a different approach to AI than most attempts. They are what KIST refers to as 'network-based humanoids': most of the actual processing occurs not in the robot body but on external, networked computer systems, with the robot receiving a continual wireless data feed.

This novel approach means that the robots can be controlled by remote servers and, one day, directly by humans located elsewhere on the net.

Mahru-M was never designed to be a truly functional home-help robot, but rather a stepping stone towards the creation of one, which the research team hoped to achieve by 2018. Mahru moved at a glacial pace, but possessed then-revolutionary machine vision skills capable of identifying items in three-dimensional space through stereoscopic vision, just as humans do.
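The principle behind stereoscopic depth perception can be sketched very simply: two horizontally offset cameras see the same object at slightly different image positions, and that offset (the disparity) shrinks as the object gets further away. The sketch below is a minimal illustration of the standard rectified-stereo relation Z = f * B / d; the focal length, baseline, and disparity values are illustrative assumptions, not Mahru-M's actual camera parameters.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Depth of a point from a rectified stereo camera pair.

    Z = f * B / d, where f is focal length in pixels, B is the
    distance between the two cameras in metres, and d is the
    disparity (horizontal pixel offset) of the matched point.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px


# Illustrative values: 700 px focal length, 10 cm baseline,
# an object matched with a 35 px disparity.
print(depth_from_disparity(700, 0.10, 35))  # 2.0 (metres)
```

Note how the relation is inverse: halving the disparity doubles the estimated distance, which is why stereo vision loses precision for far-away objects.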

Mahru waddled forwards with a relatively natural gait, and was the size of a child - a common technique for making a robot servant seem less intimidating. The natural movement extended to every joint in its body; it was even possible for the robot to dance without falling over, which at the time was again extremely impressive.

Lacking any real facial-expression ability, Mahru-M used an odour-release system to signal its 'mood', helping owners with a working olfactory sense determine whether something was wrong. Perhaps not the best idea from a human-interaction standpoint, but a good one from a technical point of view.

Additional interaction was provided by the robot's vision capabilities. As it could detect movement in 3D space, it was a natural progression for the team to develop the robot's gesture-reading capabilities. This research has since been funnelled into other projects across the globe. Additionally, speech-recognition work was refined in the Mahru project, starting with Mahru-M and continuing into its descendants. This has provided researchers with an invaluable tool for voice recognition in the noisy environments a robot would actually encounter 'in the field', such as an average household kitchen.
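One simple way gesture reading can build on 3D motion tracking is to classify the trajectory of a tracked hand over time. The sketch below is a hypothetical illustration, not KIST's actual method: it detects a waving gesture by counting reversals in the hand's lateral direction of travel, assuming the vision system already supplies a sequence of hand x-coordinates.

```python
def count_reversals(xs):
    """Count direction changes in a tracked hand's lateral position."""
    reversals = 0
    prev_dx = 0.0
    for a, b in zip(xs, xs[1:]):
        dx = b - a
        if prev_dx * dx < 0:  # sign flip => the hand changed direction
            reversals += 1
        if dx != 0:
            prev_dx = dx
    return reversals


def is_wave(xs, min_reversals=3):
    """Treat several back-and-forth reversals as a waving gesture."""
    return count_reversals(xs) >= min_reversals


# A back-and-forth trajectory (metres, left-right) registers as a wave:
print(is_wave([0.0, 0.2, 0.4, 0.2, 0.0, 0.2, 0.4, 0.2]))  # True
# A hand moving steadily in one direction does not:
print(is_wave([0.0, 0.1, 0.2, 0.3, 0.4]))  # False
```

Real gesture recognizers are far more robust (tracking full skeletons and using learned models), but the core idea of reducing a 3D trajectory to discriminative features is the same.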

The main problem Mahru-M had, confining it solely to the lab, was its unnaturally slow movement system. Every action was slow, careful and painstaking. So slow, in fact, that if you and the robot were just 5 metres from the counter, you could walk over, make a sandwich, tidy up after yourself and turn away before the robot had reached it. Making the sandwich was also somewhat beyond its capabilities.
