Using VR to change behaviour - Beating the Bullies
We have long known that experiences in believable VR environments can count just as fully as physical-world experiences when it comes to learning from them and integrating that knowledge into the brain. Numerous studies have examined the practice for everything from racial discrimination to operations training.
A three-year study by a group of European educators, psychologists and IT specialists has started publishing its results; these look at the problems of bullying and stereotyping that so commonly develop in school environments.
By creating virtual environments to model these situations, the researchers reasoned, even the most difficult and emotionally distressing scenarios could be recreated within safe limits, carefully controlled and examined.
The project was termed eCIRCUS, or Education through Characters with emotional-Intelligence and Role-playing Capabilities that Understand Social interaction, which is understandably a bit of a mouthful. It revolves around needs-based AI and immersive VR: the young person is immersed in a virtual environment populated by NPC avatars that, to all outward appearances, behave as much like their real social peers as possible.
Two splinter programs, FearNot! and ORIENT, have recently been announced, drawing on the successes of eCIRCUS and furthering the research, which is in its own way a testament to the program's success. Both attempt to place the individual in a defined role within a social VR, and to engage that person's emotions tightly as they interact with the characters, helping them modify their own behaviour. Of course, the same technology would be a godsend to many other fields as well.
The eCIRCUS researchers first focused on primary school children who were the victims of bullying. They drew on recent psychological theories that highlight the importance of feelings for changing how individuals interact with and treat one another. By hooking the individual's emotions and playing them like an instrument, the virtual environments reinforce behaviour that moves away from the urge to bully, and towards empathy for the victim's situation.
This is then used in scenarios which see the young people aiding victims of bullying within the virtual world, running through different ideas to see which work, and which produce the best emotional results for them, as they learn.
The FearNot! program is a subdivision of this work, aimed at younger individuals at infant and junior school level (younger than 11 years). The name stands for the complex mouthful Fun with Empathic Agents to Achieve Novel Outcomes in Teaching. It is essentially a collection of needs-based AIs which interact on a limited level with the participants - taking advantage of the young age of the audience to cover the gaps in current AI research. Each virtual child has a rudimentary brain, an emotional mindset, and the ability to remember - and, to a limited extent, learn from - previous decision paths.
This means that if the child suggests a help path the victim has been down before, that victim remembers the outcome of that path, and its emotional state reflects this, ups and downs included. That remembered feeling then shapes its response to the plan, rather than a flat "you've tried this option before, move on" approach.
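To make the idea concrete, here is a minimal sketch of such a memory-coloured character. It is illustrative only: the class name, the mood scale and the numeric weights are all invented for this example, not taken from the project's actual model.

```python
class VictimAgent:
    """Toy sketch of a virtual victim whose feelings about a suggested
    coping plan depend on how that plan went in previous run-throughs."""

    def __init__(self):
        self.mood = 0.0    # -1.0 (distressed) .. +1.0 (hopeful)
        self.memory = {}   # plan name -> list of remembered outcomes

    def consider(self, plan):
        """React to a suggested plan, coloured by remembered outcomes."""
        past = self.memory.get(plan, [])
        if past:
            # The agent doesn't just say "tried that": its expectation
            # is the average of how the plan felt before.
            expectation = sum(past) / len(past)
        else:
            expectation = 0.1  # mild hope for a novel idea
        self.mood = max(-1.0, min(1.0, self.mood + expectation))
        return "willing" if self.mood > -0.3 else "reluctant"

    def experience(self, plan, outcome):
        """Record how a plan actually turned out (-1 bad .. +1 good)."""
        self.memory.setdefault(plan, []).append(outcome)
        self.mood = max(-1.0, min(1.0, self.mood + outcome))
```

A fresh agent greets a new suggestion with mild hope, while one that remembers a plan going badly becomes reluctant to try it again; the emotional state, not a scripted flag, carries the history.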
Thus no pre-scripting is present in FearNot!, which makes it almost unique among complex VRs. Instead, each AI agent adapts on its own to each choice, and the same options chosen for the same AI on two different run-throughs may have very different end results. This adds considerable realism and believability to the process, which in turn only serves to hook the participant's emotional state more tightly.
To test the effectiveness of FearNot!, the team tried it out with close to 1000 students in 30 primary schools across Germany and the UK. The researchers tested FearNot! by comparing a group of users and a control group of non-users, similar to the method used for testing medical treatments.
With just three half-hour sessions over the course of three weeks, and even with the rudimentary level of the emotional AIs used, the results were encouraging, with marked improvements among the participating students compared to the control groups.
The software was originally developed as part of eCIRCUS, and is now making its own way as a separately funded project.
ORIENT is the older sibling of FearNot!, and much more of a true VR than its cousin, because it is aimed at a more mature audience: teenagers. It centres on the planet Orient, where the students arrive. Also unlike FearNot!, ORIENT is multi-user. Students are sent in, in groups of three, to examine the local culture and discover why it is tearing itself apart, with different ethnic groups bullying and oppressing one another.
To give a level of abstraction away from human society, Orient is populated by aliens called Sprytes, who look rather like large bipedal tree frogs and who have their own language and customs. The initial dialogue is scripted, as it introduces the students to the Sprytes; then, as with FearNot!, the Sprytes' own needs-based emotional AI takes over. The scripting is dropped, and the students are free to interact with the Sprytes as peers. Admittedly none of them have minds as complex as a human brain, but it is still remarkable how much realism they project simply by being completely unscripted, independent entities.
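The core of a needs-based agent like this can be sketched in a few lines. This is a drastic simplification under stated assumptions: the need names, growth rate and winner-takes-all selection are invented for illustration, and real architectures of this kind layer emotion and memory on top.

```python
class NeedDrivenAgent:
    """Toy needs-based action selection: each need's urgency grows
    over time, and the agent acts on whichever is most pressing."""

    def __init__(self, needs):
        # Map of need name -> urgency in [0, 1]; 0 means satisfied.
        self.needs = dict(needs)

    def tick(self, growth=0.1):
        """Let every need become a little more urgent as time passes."""
        for name in self.needs:
            self.needs[name] = min(1.0, self.needs[name] + growth)

    def choose_action(self):
        """Act on the most urgent need; acting satisfies it."""
        most_urgent = max(self.needs, key=self.needs.get)
        self.needs[most_urgent] = 0.0
        return most_urgent
```

Because behaviour falls out of internal drives rather than a script, an observer sees a creature that appears to want things, which is exactly the effect the unscripted Sprytes rely on.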
This independence is what makes ORIENT so complicated - well, that and the sheer scale of the world. As project co-ordinator Ruth Aylett put it: "We wanted users to feel adrift in this alien culture. How can you empathise with new people in your own culture if you've never experienced being adrift yourself?"
So the researchers deliberately cast the students adrift in a completely foreign world, with no ties or preconceptions, while at the same time letting them bond emotionally with the characters they meet along the way.
The interaction between the Sprytes and the students produces an unpredictable "emergent narrative". "There's no fixed plot," Aylett said. "Our characters are acting autonomously, making up their minds as they go." According to Aylett, students standing in front of a large screen and interacting with these psychologically believable aliens soon respond as if they were real.
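An emergent narrative of this kind can be pictured as nothing more than a log of what each autonomous character decided at each moment. The sketch below is a stand-in, not the project's engine: agent behaviour is reduced to a seeded random choice over each character's own options, and the names are invented.

```python
import random

def emergent_story(agents, steps, seed=None):
    """Generate a 'plot' with no script: at every time step, each
    autonomous agent picks one of its own actions, and the story is
    simply the resulting log of (step, agent, action) events."""
    rng = random.Random(seed)
    log = []
    for t in range(steps):
        for name, options in agents.items():
            log.append((t, name, rng.choice(options)))
    return log
```

Run it twice with different seeds and a different story unfolds each time, which is the essence of Aylett's "no fixed plot": the narrative is a side effect of the characters making up their own minds.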
This project has opened up a whole new avenue for VR in general: employing intelligent, unscripted AI agents and using them to bond with the minds of the participants and form true emotional links. This offers a level of immersion we have not seen before.
Obviously the current constructs are crude, and we have a long way to go before they result in AIs as believable as other humans, but it is a start. As a very nice gesture, now that the initial funding cycle is over, eCIRCUS has released the complete AI source code as open source via SourceForge for other efforts to experiment with, under a Creative Commons licence.