Human Interaction with Personally Invasive Non-Lifelike Robots
Robots have great potential in healthcare: from robotic porters to robot-assisted surgery, full robotic surgery, robotic administration staff, robot nurses, and home-care helpers. Yet one key question has long remained unanswered. We know we are still struggling with the uncanny valley: the deep dip in believability between 'cute' and human-like figures, which occurs when an artificial human looks just lifelike enough for its 'wrongness' to be obvious.
This 'wrongness' provokes fear and revulsion for the same reason we find corpses disturbing and zombies scary: they are close to living humans, but physical aspects are wrong. It's also the reason many people with learning impairments or stroke damage are met with revulsion by others. They move 'wrong'.
So it has been accepted for some time that, given our continued inability to cross the valley, our industrial, commercial, and healthcare robots will not have a human-like appearance for some time yet, focusing instead on other, more compact or efficient forms.
The question that remained unanswered is this: how will humans in general react to being physically manipulated, or having their personal space invaded, by clearly intelligent yet alien machinery?
Will a robotic nurse or care worker that looks like a machine be acceptable to a majority of individuals, or will its very mechanical appearance serve as a barrier to trust?
An initial study has been undertaken by researchers at the Georgia Institute of Technology. It used a small pool of volunteers, some 56 in number, and a stable, self-supporting mobile manipulation robot named Cody as the test robot.
Cody could hardly have been less human-like. It consisted of a Segway robotic base unit and two anthropomorphic, seven-jointed Meka robotic manipulation arms, with pieces of flannel towel cut to the shape of the end manipulators and placed entirely over them, to give the impression of a warm skin of sorts. As should be obvious from this description, Cody was assembled by the researchers specifically for this research.
Several different tests were carried out on each of the volunteers, and some surprising results soon became clear. The response to being touched by the robot was generally positive, but always coloured by the patient's perception and preconceptions of what the robot was doing. If they felt they were being cleaned by the robot, for example, the response was positive. If they felt they were being examined by the robot, the response was significantly less so. In every instance, Cody actually touched the volunteer patient's skin in exactly the same way; the patient's preconception was the only difference.
Perhaps even more surprisingly, responses were typically more positive when the robot did not announce its intentions first: when it reached out and touched the person without waiting for consent, or giving them a chance to prepare.
These results echo similar studies done with nurses.
Since many useful tasks require that a robot touch a person, the team believes that future research should investigate ways to make robot touch more acceptable to people, especially in healthcare. Many important healthcare tasks, such as wound dressing and assisting with hygiene, would require a robotic nurse to touch the patient's body.
"If we want robots to be successful in healthcare, we're going to need to think about how we make those robots communicate their intention, and how people interpret the intentions of the robot," added Kemp. "And I think people haven't been as focused on that until now. Primarily, people have been focused on how we can make the robot safe and how we can make it do its task effectively. But that's not going to be enough if we actually want these robots out there helping people in the real world."
The research was presented at the 2012 Human-Robot Interaction conference in Boston, USA.