
Trust and Computing: Trust Higher when Machines Appear Human

With all the focus on the uncanny valley, and on the difficulty of making artificial systems that already appear somewhat human appear more so, one area of research has been relatively neglected: how does interaction differ between machines that look like machines and machines with human elements?

Researchers at Clemson University in the US have been studying how humans interact with various types of computer interface linked to a diagnostic AI, to see how the visual elements included affect the user's willingness to trust the artificial mind's judgement.

A medical test paper used for the study, along with a smartphone loaded with a diagnostic AI. The AI analyses the questions and determines an answer. This version used a mechanical interface accentuated by an animation of spinning gears and cogs. Trust in the machine's judgement was quite low.


A study recently published (in the July 17 edition of the journal Ergonomics) by professor Richard Pak along with research students Nicole Fink, Margaux Price, Brock Bass and Lindsay Sturre, shows many of the results of their work to date. The work itself is ongoing.

The study is one of the very first to examine how the design of decision-support aids on devices can influence the level of trust that users place in a system, and how much they use it. The worst-case scenario for the user of an intelligent device is that the AI makes a correct decision in a critical situation, but the user discards the advice because of an inherent lack of trust in the way the device conveys information. Almost as bad is the possibility that trust is placed in an alternative, unreliable artificial data source simply because, by pure chance, it has a visual interface which engenders more trust.

“Just as trust is an important factor in how humans deal with other humans, it also can determine how users interact with computerized systems," Pak said. "Trust can be influenced by the aid’s reliability and level of computerization as well as the user’s experience and age."

Many people interact with computerized decision aids or automation on a daily basis, whether they’re using smart phones, digital cameras or global positioning systems. When automation is only reliable sometimes, a person’s level of trust becomes an important factor that determines how often the aid will be used.

"Figuring out how trust is affected by the design of computerized aids is important because we want people to trust and use only reliable aids," said Pak.

Pak’s research findings have revealed that the inclusion of an image of a person can significantly alter perceptions of a computerized aid when there is no difference in the aid’s reliability or presentation of information.

"Humanlike computer aids provide a reduced decision-making reaction time for adults," said Pak. "A plausible explanation is that the increase in trust led to an increased dependence on the aid, which led to faster performance."

Pak’s future research will examine the specific aspects of the aid that affect trust in different age groups and genders. He is also studying the effects of the aids on users faced with decisions that have either a high consequence, such as making health decisions, or a low consequence, such as deciding what type of computer to buy.


The same medical test paper as before, and the same AI loaded into the smartphone. The only difference is that a series of stills of an actress in medical garb, wearing different expressions, was used to convey the data. Her appearance in the software substantially increased trust in its reliability, likely because users subconsciously associated her avatar with the advice being given.


Clemson researcher: humanizing computer aids affects trust, dependence

Decision support aids with anthropomorphic characteristics influence trust and performance in younger and older adults (Paper)
