It's not just how the Avatar Looks, it's how the Avatar Talks that is Remembered
When it comes to creating NPCs with a specific task in mind (salespeople in VR-based shops, for example, or official company representative AIs able to greet potential clients and answer questions about the products), how they interact with the customer is important. Making the customer feel at ease can come in many forms, as this quote from the book “Snow Crash” illustrates very well:
Source: Snow Crash, Page: 366
In that example from perhaps one of the most influential books behind modern-day attempts to create a communal VR, we can see the AI's avatar being tailored to the specifics of the customer: age, gender and race all matched as closely as possible to what that individual might well find attractive, in an effort to make a better sales pitch.
Now that is all well and good, but it seems more is necessary than appearance alone. How the AI speaks is important as well. Specifically, how much emotion is put into the words, and where it is placed, may help dictate what is remembered afterwards.
According to Annett Schirmer and colleagues from the National University of Singapore, emotion helps us recognize words more quickly and more accurately in the moment. In the longer term, however, we do not remember emotionally intoned speech as accurately as neutral speech. And when we do remember the words, they have acquired an emotional value: words spoken in a sad voice, for example, are remembered as more negative than words spoken in a neutral voice.
So care has to be taken when crafting a speech (or, for dynamically generated speech, by the learning algorithm) to create a sales pitch or set of instructions that captivates the human partner with enough emotion to hold their interest, yet stays flat enough in the sections that really matter to guarantee those are remembered, as opposed to simply coloured by the emotion.
In anger, sadness, exhilaration or fear, speech takes on an urgency that is lacking from its normal even-tempered form. It becomes louder or softer, more hurried or delayed, more melodic, erratic or monotonous. And this emotional speech immediately captures a listener's attention. Schirmer and colleagues' work focussed on the lasting effect emotion has on word memory.
A total of 48 men and 48 women listened to sadly and neutrally spoken words and were later shown these words in a visual test examining word recognition and attitudes to these words. The authors also measured brain activity to look for evidence of vocal emotional coding.
In other words, it's not just the timing of emotion that has to be watched; there is also the need to avoid words that have already been given a strong emotional inflection elsewhere in the conversation, or perhaps to reinforce a positive message by frequently repeating words with a happy emotional investment.
A human salesperson or lecturer might well struggle to keep track of all the elements involved, but for an expert system derived from a weak AI, that is an entirely different matter. It becomes possible to create a much more lasting pitch by careful use of emotional undertones in speech, and likely in body postures (movement sequences) as well, to reinforce what the voice is saying.
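To make the idea concrete, here is a minimal sketch of how such a system might track emotional colouring across a conversation. Everything here is hypothetical: the class name, the 0-to-1 intensity scale, and the 0.5 threshold are illustrative choices, not anything from Schirmer and colleagues' work. The planner simply keeps key facts flat, and avoids re-colouring words that have already carried strong emotion.

```python
# Hypothetical sketch of a tone planner for generated speech.
# Key facts are delivered neutrally (so they are recalled accurately),
# emotionally "loaded" words are not re-coloured, and everything else
# may carry emotion to hold the listener's attention.

from collections import defaultdict


class PitchPlanner:
    def __init__(self, emotion_threshold=0.5):
        # threshold above which a word counts as emotionally loaded
        self.threshold = emotion_threshold
        # strongest emotional intensity each word has carried so far
        self.word_emotion = defaultdict(float)

    def record_utterance(self, words, intensity):
        """Log an utterance already spoken, with its emotional intensity (0..1)."""
        for w in words:
            key = w.lower()
            self.word_emotion[key] = max(self.word_emotion[key], intensity)

    def plan_tone(self, words, must_remember):
        """Return (word, tone) pairs for the next utterance."""
        plan = []
        for w in words:
            key = w.lower()
            if key in must_remember:
                tone = "neutral"        # keep memorable content flat
            elif self.word_emotion[key] >= self.threshold:
                tone = "neutral"        # avoid re-colouring loaded words
            else:
                tone = "emotional"      # safe to add warmth or urgency
            plan.append((w, tone))
        return plan


planner = PitchPlanner()
# earlier in the conversation, these words were spoken angrily
planner.record_utterance(["terrible", "delay"], intensity=0.9)
plan = planner.plan_tone(
    ["our", "terrible", "rivals", "charge", "twice", "the", "price"],
    must_remember={"price"},
)
```

In this toy run, "price" stays neutral because it is a fact the customer must retain, "terrible" stays neutral because it is already emotionally loaded from earlier in the conversation, and the remaining words are free to carry emotion.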
The researchers also looked at gender differences in word processing. They found that women were more sensitive to the emotional elements than men, and were more likely than men to recall the emotion of the speaker's voice. Current levels of the female sex hormone estrogen predicted these differences.
Local: Snow Crash Quotations
Local: Snow Crash (review)