Simulating discrimination in virtual reality
Have you ever been advised to “walk a mile in someone else’s shoes?” Considering another person’s perspective can be a challenging endeavor, but recognizing our errors and biases is key to building understanding across communities. By challenging our preconceptions, we confront prejudice, such as racism and xenophobia, and potentially develop a more inclusive perspective about others.

To help with perspective-taking, MIT researchers have developed “On the Plane,” a virtual reality role-playing game (VR RPG) that simulates discrimination. In this case, the game portrays xenophobia directed against a Malaysian American woman, but the approach can be generalized. Situated on an airplane, players take on the role of characters from different backgrounds, engaging in dialogue with others while making in-game choices in response to a series of prompts. In turn, players’ decisions determine the outcome of a tense conversation between the characters about cultural differences.

As a VR RPG, “On the Plane” encourages players to take on, in the first person, new roles that may be outside of their personal experiences, allowing them to confront in-group/out-group bias by incorporating new perspectives into their understanding of different cultures. Players engage with three characters: Sarah, a first-generation Muslim American of Malaysian ancestry who wears a hijab; Marianne, a white woman from the Midwest with little exposure to other cultures and customs; or a flight attendant. Sarah represents the out-group, Marianne is a member of the in-group, and the flight attendant is a bystander witnessing the exchange between the two passengers.

“This project is part of our efforts to harness the power of virtual reality and artificial intelligence to address social ills, such as discrimination and xenophobia,” says Caglar Yildirim, an MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) research scientist who is a co-author and co-game designer on the project. “Through the exchange between the two passengers, players experience how one passenger’s xenophobia manifests itself and how it affects the other passenger. The simulation engages players in critical reflection and seeks to foster empathy for the passenger who was ‘othered’ because her outfit was not so ‘prototypical’ of what an American should look like.”

Yildirim worked alongside the project’s principal investigator, D. Fox Harrell, MIT professor of digital media and AI at CSAIL, the Program in Comparative Media Studies/Writing (CMS), and the Institute for Data, Systems, and Society (IDSS), and founding director of the MIT Center for Advanced Virtuality. “It is not possible for a simulation to give someone the life experiences of another person, but while you can’t ‘walk in someone else’s shoes’ in that sense, a system like this can help people recognize and understand the social patterns at work when it comes to issues like bias,” says Harrell, who is also a co-author and designer on this project. “An engaging, immersive, interactive narrative can also impact people emotionally, opening the door for users’ perspectives to be transformed and broadened.”

The simulation also utilizes an interactive narrative engine that creates several options for responses to in-game interactions based on a model of how people are categorized socially. The tool gives players a chance to alter their standing within the simulation through their response choices to each prompt, affecting their affinity toward the other two characters. For instance, if you play as the flight attendant, you can react to Marianne’s xenophobic expressions and attitudes toward Sarah, changing your affinities. The engine will then provide you with a different set of narrative events based on your changes in standing with others.
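The article does not spell out the engine’s internals (the social categorization is handled by Harrell’s Chimeria platform, noted below), so the following Python sketch is purely illustrative, using hypothetical names such as `PlayerStanding` and `next_narrative_events`. It shows the general idea described above: each response choice carries affinity shifts toward the other characters, and the accumulated standing gates which narrative events the engine offers next.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: the actual engine models social categorization
# more richly; here standing is a single affinity score per character in [-1, 1].

@dataclass
class PlayerStanding:
    affinities: dict = field(default_factory=lambda: {"Sarah": 0.0, "Marianne": 0.0})

    def apply_choice(self, deltas: dict) -> None:
        """Shift affinities by the deltas attached to the chosen response."""
        for character, delta in deltas.items():
            value = self.affinities.get(character, 0.0) + delta
            self.affinities[character] = max(-1.0, min(1.0, value))  # clamp to [-1, 1]


def next_narrative_events(standing: PlayerStanding) -> list:
    """Offer different narrative events depending on the player's current standing."""
    if standing.affinities["Sarah"] > 0.5:
        return ["sarah_opens_up", "marianne_challenged"]
    if standing.affinities["Marianne"] > 0.5:
        return ["marianne_escalates", "sarah_withdraws"]
    return ["neutral_smalltalk"]


# Example: playing as the flight attendant, the player pushes back on a
# xenophobic remark, raising affinity with Sarah and lowering it with Marianne.
state = PlayerStanding()
state.apply_choice({"Sarah": +0.6, "Marianne": -0.3})
print(next_narrative_events(state))  # ['sarah_opens_up', 'marianne_challenged']
```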

To animate each avatar, “On the Plane” incorporates artificial intelligence knowledge representation techniques controlled by probabilistic finite state machines, a tool commonly used in machine learning systems for pattern recognition. With the help of these machines, characters’ body language and gestures are customizable: if you play as Marianne, the game will customize her mannerisms toward Sarah based on user inputs, impacting how comfortable she appears in front of a member of a perceived out-group. Similarly, players can do the same from Sarah’s or the flight attendant’s perspective.
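As a hedged illustration rather than the game’s actual implementation, the sketch below shows the general technique of a probabilistic finite state machine for body language: hypothetical posture states (“relaxed,” “guarded,” “closed_off”) transition stochastically, and a comfort score derived from player inputs re-weights those transition probabilities.

```python
import random

# Hypothetical gesture states and base transition probabilities for one character.
BASE_TRANSITIONS = {
    "relaxed":    {"relaxed": 0.7, "guarded": 0.25, "closed_off": 0.05},
    "guarded":    {"relaxed": 0.3, "guarded": 0.5,  "closed_off": 0.2},
    "closed_off": {"relaxed": 0.1, "guarded": 0.4,  "closed_off": 0.5},
}

def reweight(probs: dict, comfort: float) -> dict:
    """Bias transition probabilities by a comfort score in [0, 1];
    lower comfort makes guarded/closed-off postures more likely."""
    weighted = {
        "relaxed": probs["relaxed"] * (0.5 + comfort),
        "guarded": probs["guarded"],
        "closed_off": probs["closed_off"] * (1.5 - comfort),
    }
    total = sum(weighted.values())
    return {state: w / total for state, w in weighted.items()}

def step(current: str, comfort: float) -> str:
    """Sample the next body-language state from the re-weighted distribution."""
    probs = reweight(BASE_TRANSITIONS[current], comfort)
    states, weights = zip(*probs.items())
    return random.choices(states, weights=weights, k=1)[0]

# Example: a character starts relaxed; repeated low-comfort interactions tend
# to drive her posture toward guarded or closed-off gestures.
state = "relaxed"
for _ in range(5):
    state = step(state, comfort=0.2)
print(state)
```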

In a 2018 paper based on work done in a collaboration between MIT CSAIL and the Qatar Computing Research Institute, Harrell and co-author Sercan Şengün advocated for virtual system designers to be more inclusive of Middle Eastern identities and customs. They argued that if designers allowed users to customize virtual avatars more representative of their backgrounds, it might empower players to engage in a more supportive experience. Four years later, “On the Plane” accomplishes a similar goal, incorporating a Muslim’s perspective into an immersive environment.

“Many virtual identity systems, such as avatars, accounts, profiles, and player characters, are not designed to serve the needs of people across diverse cultures. We have used statistical and AI methods along with qualitative approaches to learn where the gaps are,” they note. “Our project helps engender perspective transformation so that people will treat each other with respect and enhanced understanding across diverse cultural avatar representations.”

Harrell and Yildirim’s work is part of the MIT IDSS’s Initiative on Combatting Systemic Racism (ICSR). Harrell is on the initiative’s steering committee and is the leader of the newly forming Antiracism, Games, and Immersive Media vertical, which studies behavior, cognition, social phenomena, and computational systems related to race and racism in video games and immersive experiences.

The researchers’ latest project is part of the ICSR’s broader goal to launch and coordinate cross-disciplinary research that addresses racially discriminatory processes across American institutions. Using big data, members of the research initiative develop and employ computing tools that drive racial equity. Yildirim and Harrell work toward this goal by depicting a frequent, problematic scenario that illustrates how bias creeps into our everyday lives.

“In a post-9/11 world, Muslims often experience ethnic profiling in American airports. ‘On the Plane’ builds off of that type of in-group favoritism, a well-established finding in psychology,” says MIT Professor Fotini Christia, director of the Sociotechnical Systems Research Center (SSRC) and associate director of IDSS. “This game also takes a novel approach to analyzing hardwired bias by using VR instead of field experiments to simulate prejudice. Excitingly, this research demonstrates that VR can be used as a tool to help us better measure bias, combating systemic racism and other forms of discrimination.”

“On the Plane” was developed on the Unity game engine using the XR Interaction Toolkit and Harrell’s Chimeria platform for authoring interactive narratives that involve social categorization. The game will be deployed for research studies later this year on both desktop computers and the standalone, wireless Meta Quest headsets. A paper on the work was presented in December at the 2022 IEEE International Conference on Artificial Intelligence and Virtual Reality.
