It's not just how the Avatar Looks, it's how the Avatar Talks that is Remembered

When it comes to creating NPCs with a specific task in mind (salespeople in VR-based shops, for example, or official company representative AIs able to address potential clients and answer questions about the products), how they interact with the customer is important. Making the customer feel at ease can come in many forms, as this quote from the book “Snow Crash” illustrates very well:

 

Ng Security Industries has a whole floor of a mile-high neon skyscraper near Port One, right in the middle of Downtown. Like everything else in the Metaverse, it's open twenty-four hours, because it's always business hours somewhere in the world. Hiro leaves his bike on the Street, takes the elevator up to the 397th floor, and comes face to face with a receptionist daemon. For a moment, he can't peg her racial background; then he realizes that this daemon is half-black, half-Asian - just like him. If a white man had stepped off the elevator, she probably would have been a blonde. A Nipponese businessman would have come face to face with a perky Nipponese girl.

Source: Snow Crash, page 366

In that example, from perhaps one of the most influential books behind modern-day attempts to create a communal VR, we can see the AI's avatar being tailored to the specifics of the customer: age, gender and race, all matched as closely as possible to what that individual might well find attractive, in an effort to make a better sales pitch.

Now that is all well and good, but it seems more is necessary than appearance alone. How the AI speaks is important as well. Specifically, how much emotion is put into the words, and where it is placed, may help to dictate what is remembered afterwards.

According to Annett Schirmer and colleagues from the National University of Singapore, emotion helps us recognize words more quickly and more accurately straight away. In the longer term, however, we do not remember emotionally intoned speech as accurately as neutral speech. When we do remember the words, they have acquired an emotional value; for example, words spoken in a sad voice are remembered as more negative than words spoken in a neutral voice.

So care has to be taken when crafting a speech (or, with dynamically generated speech, by the learning algorithm) to create a sales pitch or set of instructions that captivates the human partner with enough emotion to hold their interest, but stays flat enough in the sections that really matter to guarantee those sections will be remembered, as opposed to simply coloured by the emotion.
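
To make that concrete, here is a minimal, purely hypothetical sketch (in Python) of how a dialogue system might plan prosody across a generated pitch before handing it to a text-to-speech engine. The segment structure, emotion labels and intensity values are illustrative assumptions, not anything described in the research.

# Hypothetical sketch: plan emotional intensity across a generated pitch.
# Key informational segments are kept neutral so they are recalled accurately;
# framing segments carry the emotion that grabs attention.
from dataclasses import dataclass

@dataclass
class Segment:
    text: str
    must_be_remembered: bool      # key facts, prices, instructions
    emotion: str = "neutral"      # prosody label passed to the TTS engine
    intensity: float = 0.0        # 0.0 = flat delivery, 1.0 = strongly emotional

def plan_prosody(segments: list[Segment]) -> list[Segment]:
    """Keep attention-grabbing emotion in the framing, but deliver the
    segments that must be recalled accurately in a flat, neutral tone."""
    for seg in segments:
        if seg.must_be_remembered:
            seg.emotion, seg.intensity = "neutral", 0.0
        else:
            seg.emotion, seg.intensity = "enthusiastic", 0.6
    return segments

pitch = plan_prosody([
    Segment("This upgrade will change how your team works.", False),
    Segment("The licence costs 40 credits per seat per month.", True),
    Segment("Imagine never losing a customer to a queue again!", False),
])
for seg in pitch:
    print(f"[{seg.emotion}:{seg.intensity}] {seg.text}")

The split into "remembered" and "framing" segments is only one possible design; the point is simply that the placement of emotion becomes an explicit, controllable parameter rather than an accident of delivery.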

In anger, sadness, exhilaration or fear, speech takes on an urgency that is lacking from its normal even-tempered form. It becomes louder or softer, more hurried or delayed, more melodic, erratic or monotonous. And this emotional speech immediately captures a listener's attention. Schirmer and colleagues' work focussed on the lasting effect emotion has on word memory.

A total of 48 men and 48 women listened to sadly and neutrally spoken words and were later shown these words in a visual test, examining word recognition and attitudes to these words. The authors also measured brain activity to look for evidence of vocal emotional coding.

Their analyses showed that participants recognized words better when they had previously heard them in the neutral tone compared with the sad tone. In addition, words were remembered more negatively if they had previously been heard in a sad voice.

In other words, it's not just the timing of emotion that has to be watched for; there is also the need to avoid words that have already been given a strong emotional inflection elsewhere in the conversation, or perhaps to reinforce a positive message by frequently repeating words that carry a happy emotional investment.
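
As a purely illustrative sketch of what that bookkeeping might look like, a dialogue system could keep a running tally of the emotional delivery each word has already received, and consult it when choosing wording for the parts of the pitch that matter. The class, valence scores and thresholds below are assumptions made for the example, not anything taken from the study.

# Hypothetical sketch: track the emotional "charge" each word has picked up
# earlier in the conversation, so strongly negatively inflected words can be
# avoided and positively inflected ones repeated.
from collections import defaultdict

class EmotionalWordHistory:
    def __init__(self):
        self.valence = defaultdict(float)   # word -> accumulated valence

    def record(self, word: str, spoken_valence: float) -> None:
        """spoken_valence: -1.0 (sad/angry delivery) .. +1.0 (happy delivery)."""
        self.valence[word.lower()] += spoken_valence

    def safe_to_use(self, word: str, threshold: float = -0.5) -> bool:
        """Avoid words already coloured by strongly negative delivery."""
        return self.valence[word.lower()] > threshold

    def words_to_reinforce(self, threshold: float = 0.5) -> list[str]:
        """Words carrying a happy emotional investment, worth repeating."""
        return [w for w, v in self.valence.items() if v >= threshold]

history = EmotionalWordHistory()
history.record("delay", -0.8)     # mentioned earlier in an apologetic, sad tone
history.record("upgrade", +0.7)   # mentioned earlier with real enthusiasm

print(history.safe_to_use("delay"))    # False: pick a different word
print(history.words_to_reinforce())    # ['upgrade']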

A human salesperson or lecturer might well struggle to keep track of all the elements involved, but for an expert system derived from a weak AI, that is an entirely different matter. It becomes possible to create a much more lasting pitch by careful use of emotional undertones in speech, and likely in body postures (movement sequences) as well, to reinforce what the voice is saying.

The researchers also looked at gender differences in word processing. They found that women were more sensitive to the emotional elements than men, and were more likely than men to recall the emotion of the speaker's voice. Current levels of the female sex hormone estrogen predicted these differences.

Schirmer and team conclude: "Emotional voices produce changes in long-term memory, as well as capturing the listener's attention. They influence how easily spoken words are later recognized and what emotions are assigned to them. Thus voices, like other emotional signals, affect listeners beyond the immediate present."

References

Words have feelings

Vocal emotions influence verbal memory: neural correlates and inter-individual differences (paper)

Local: Snow Crash Quotations

Local: Snow Crash (review)

