How do we keep the virtual world from being infected with real world biases?

Student developers from Champlain College showcasing Spacebox, an alternative-controller game, last year in San Francisco.

Advances in artificial intelligence, robotics, and augmented and virtual reality are bringing us into fabricated worlds that seem real.

Lowe’s “Holoroom” allows customers to don headsets and learn the skills to install dishwashers or complete other challenging home-improvement projects.

Pokémon Go, powered by cloud computing and Google Maps, enables players to interact with one another and visit local businesses. Within three months, this free game produced $600 million in revenue through in-game purchases.

This can seem right out of Star Trek's holodeck, but daily we interact with technology, our feet firmly on the ground while our fingers lead our minds into the computer-generated worlds of social media, video games, robotics, and now virtual and augmented reality.

As our lives become more deeply dependent on smart devices such as phones, watches, and home products like Alexa, it is important to recognize that the software that drives them is written by humans, who often carry their unconscious biases into the design.

What is embedded in the software can influence our perceptions at a barely noticeable level. How can we be assured that we can distinguish between the actual and the virtual, fact from fiction?

In the early release of Pokémon Go, researchers found that more stops and virtual characters were to be discovered in predominantly white neighborhoods than in minority neighborhoods. As we increasingly use products such as Apple's Siri, Microsoft's Cortana, and Amazon's Alexa, do we ever question why these digital servants all have female personas, ready to wait on us, while supercomputers that do deep computation — such as HAL in the film 2001: A Space Odyssey or IBM's Watson — are named for men?

Even computers are having a difficult time determining bias. Researchers from the University of Bath in the United Kingdom and Princeton University recently reported in Science that computers using artificial intelligence are acquiring racial and gender bias.

Google Translate, which depends on an AI system to provide more natural translations, has been found to attach male pronouns to words such as doctor and female pronouns to words such as nurse.

If it is difficult for a computer to distinguish implicit bias, how much more difficult is it for developers?

Fortunately, there are initiatives moving this question forward. The IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems aims to set standards, certifications, and codes of conduct "for ethical implementation of intelligent technologies." Harvard's Kennedy School of Government, with the Future Society, is working to address the policy questions that emerging technologies raise.

Still, as consumers, we need to be aware on a daily basis of how the virtual affects our perceptions and decisions. As an educator, I believe the answer lies in education. Our students must be given the keys to drive the future. They will depend on, design, and build these new technologies, so they will have to examine the impact of what they use and create, and solve problems as they go.

At the Champlain College Emergent Media Center, we give students the tools to ask and answer questions about technology and its impacts. One group is designing an online literacy course that automatically tracks student progress and motivates learners. Another team is creating virtual-reality experiences that enable participants to discover clues and to question context and their perceptions.

If done well, virtual reality can actually combat racial and gender bias. A University of Barcelona study showed that participants who embodied a virtual person of a different race displayed reduced racial bias afterward.

For example, we know that bullying is an international problem, and that bias drives both the behavior of the bully and attitudes toward the bully. My students have developed Breakaway, a game proven to change attitudes toward bullying, and are now crafting it as a mobile game for broader reach. Breakaway tackles violence against women and girls through a fun, interactive soccer game. It models behavior and encourages change by prompting players to think critically about personal biases, the actions those biases drive, and their consequences.

Ann DeMarle is the director of the Emergent Media Center and associate dean of emergent media in the Division of Communication and Creative Media at Champlain College in Burlington, Vt. demarle@champlain.edu