Trust and Computing: Trust Higher when Machines Appear Human
With all the focus on the uncanny valley, and on the difficulty of making artificial systems that already appear somewhat human appear more so, one area of research has been relatively neglected: how does interaction differ between machines that look like machines and machines with human elements?
Researchers at Clemson University in the US have been studying the differences in how humans interact with various types of computer interface linked to a diagnostic AI, to see how the visual elements included affect the user's willingness to trust the artificial mind's judgement.
A study recently published (in the July 17 edition of the journal Ergonomics) by professor Richard Pak, along with research students Nicole Fink, Margaux Price, Brock Bass and Lindsay Sturre, presents many of the results of their work to date; the work itself is ongoing.
The study is one of the very first to examine how the design of decision-support aids on devices can influence the level of trust that users place in a system, and how much they use it. The worst-case scenario for the user of an intelligent device is that the AI makes a correct decision in a critical situation, but the user discards the advice because of an inherent lack of trust in the way the device conveys information. Almost worse is the possibility that trust is placed in an alternate, unreliable artificial data source simply because, by pure chance, it has a visual interface that engenders more trust.
"Just as trust is an important factor in how humans deal with other humans, it also can determine how users interact with computerized systems," Pak said. "Trust can be influenced by the aid's reliability and level of computerization, as well as the user's experience and age."
Many people interact with computerized decision aids or automation on a daily basis, whether they're using smart phones, digital cameras or global positioning systems. When automation is only reliable some of the time, a person's level of trust becomes an important factor in determining how often the aid will be used.
"Figuring out how trust is affected by the design of computerized aids is important because we want people to trust and use only reliable aids," said Pak.
Pak's research findings have revealed that including an image of a person can significantly alter perceptions of a computerized aid even when there is no difference in the aid's reliability or presentation of information.
"Humanlike computer aids provide a reduced decision-making reaction time for adults," said Pak. "A plausible explanation is that the increase in trust led to an increased dependence on the aid, which led to faster performance."
Pak's future research will examine the specific aspects of the aid that affect trust across different age groups and genders. He is also studying the effects of the aids on users faced with decisions that carry either high consequences, such as making health decisions, or low consequences, such as deciding what type of computer to buy.