Robots Get a Feel for the World: Touch More Sensitive Than a Human's

This story is from the category Sensors

Date posted: 19/06/2012

What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel -- or at least the ability to identify different materials by touch.

Researchers at the University of Southern California's Viterbi School of Engineering published a study June 18 in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.

The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm to make decisions about how to explore the outside world by imitating human strategies. Beyond texture, the sensor can also detect where and in which direction forces are applied to the fingertip, and even the thermal properties of an object being touched.

Like the human finger, the group's BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the robot finger is even more sensitive.
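The vibration-based identification described above can be sketched in code. The following is a hypothetical simplification, not the BioTac's actual processing pipeline: it reduces a recorded vibration signal to two simple spectral features (total power and dominant frequency) and matches them against stored texture signatures by nearest neighbor. The sample rate, feature choice, and texture names are all illustrative assumptions.

```python
import numpy as np

def spectral_features(signal, sample_rate=2200):
    """Reduce a vibration recording to a small feature vector:
    total spectral power and the dominant vibration frequency.
    (sample_rate is an assumed value, not the BioTac's.)"""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return np.array([spectrum.sum(), freqs[np.argmax(spectrum)]])

def classify_texture(signal, signatures):
    """Match the signal's features against stored texture signatures
    (a dict of name -> feature vector) by nearest neighbor."""
    f = spectral_features(signal)
    # Compare log-power and frequency so the two scales are comparable.
    def dist(g):
        return np.hypot(np.log1p(f[0]) - np.log1p(g[0]), f[1] - g[1])
    return min(signatures, key=lambda name: dist(signatures[name]))
```

In this toy version, each texture's characteristic vibration is summarized once into a signature; a new recording is then assigned to whichever stored signature it most resembles.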

When humans try to identify an object by touch, they use a wide range of exploratory movements based on their prior experience with similar objects. A famous theorem by the 18th-century mathematician Thomas Bayes describes how beliefs can be updated from the information obtained during such movements. Until now, however, there was no principled way to decide which exploratory movement to make next. The article, authored by Professor of Biomedical Engineering Gerald Loeb and recently graduated doctoral student Jeremy Fishel, describes their solution to this general problem, which they call "Bayesian Exploration."
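The idea can be made concrete with a small sketch. This is a hypothetical illustration of the general principle, not the authors' published algorithm: each candidate movement has an assumed outcome model (how reliably its observations distinguish materials), the robot picks the movement expected to reduce its uncertainty the most, and it updates its belief over materials with Bayes' rule after each observation. All material names, movements, and probabilities below are invented for illustration.

```python
import math

def entropy(belief):
    """Shannon entropy (in bits) of a discrete belief over materials."""
    return -sum(p * math.log2(p) for p in belief.values() if p > 0)

def bayes_update(prior, likelihood, observation):
    """Bayes' rule: posterior proportional to P(obs | material) * prior."""
    post = {m: likelihood[m][observation] * prior[m] for m in prior}
    z = sum(post.values())
    return {m: v / z for m, v in post.items()}

def expected_entropy(prior, likelihood, observations):
    """Expected posterior entropy after one movement whose outcome
    probabilities are given by `likelihood`."""
    total = 0.0
    for obs in observations:
        p_obs = sum(likelihood[m][obs] * prior[m] for m in prior)
        if p_obs > 0:
            total += p_obs * entropy(bayes_update(prior, likelihood, obs))
    return total

def choose_movement(prior, movements, observations):
    """Pick the exploratory movement that minimizes expected posterior
    entropy, i.e. maximizes expected information gain."""
    return min(movements,
               key=lambda mv: expected_entropy(prior, movements[mv], observations))
```

The robot then loops: choose a movement, observe, update the belief, and stop once one material's posterior probability crosses a confidence threshold. In the study, that loop converged after an average of about five movements.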

Built by Fishel, the specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by pairs of similar textures that human subjects making their own exploratory movements could not distinguish at all.

See the full story via external site: www.sciencedaily.com


