Markkula Center for Applied Ethics

Can Technology Help Us be More Empathetic? Racism, Empathy and Virtual Reality

Photo of a boy holding a VR headset

Erick Jose Ramirez

Julie M. Cameron/Pexels.com

Erick Jose Ramirez is an associate professor with the Department of Philosophy at Santa Clara University. Views are his own.

In the wake of global protests over anti-Black police violence, many are finally asking themselves how to combat systemic, institutionalized injustice. As is often the case in the Bay Area and beyond, some are hoping that technology can help us overcome one of the sources of intersectional (racialized, gendered, classed) injustice: lack of empathy. Because virtual reality (VR) technologies are so immersive and so successful at creating a sense of “being” in virtual spaces, VR has become the focus of these techno-solutions.

It’s perhaps not surprising that VR has received so much attention as an empathy enhancer. The creators of “1,000 Cut Journey” describe it as a VR experience in which “the viewer becomes Michael Sterling, a black man, encountering racism as a young child, adolescent, and young adult.” Speaking of his own VR installation, “Carne y Arena” (which simulates the experience of migrating without documents into the U.S.), director Alejandro Iñárritu has said his “intention was to experiment with VR technology to explore the human condition in an attempt to break the dictatorship of the frame, within which things are just observed, and claim the space to allow the visitor to go through a direct experience walking in the immigrants’ feet, under their skin, and into their hearts.” Creators of similar VR simulations promise to give their users the experiences of being Palestinian, homeless, pregnant, in a wheelchair, or autistic. One even claims to help users know what it's like to be a cow raised for slaughter. We should be skeptical of any simulation that makes such promises and worry about the intersectional implications of these simulations.

The idea is that technology might help us better understand what it’s like to be someone on the receiving end of racist violence (or gendered violence or ableist and classist discrimination, etc.) and that better understanding someone else’s experience of marginalization can help us grasp the roots of our own racism and then combat it. Unfortunately, such approaches rely on seriously problematic assumptions about what it means to experience racism (or misogyny or classism or ableism) and often perpetuate the very racism they’re trying to help stop. To see why, we need to understand what intersectionality means and a little more about the psychology of experience.

Intersectionality is a complex concept. The term was coined by Kimberlé Crenshaw (1989, 1991) to help clarify how different forms of oppression interact. Black women, she argued, often encounter a kind of discrimination that can’t be reduced to racial discrimination or gender discrimination alone but in which the two intersect in unique ways. Today, the concept of intersectionality is also used to talk about how race, gender, class, orientation, nationality, and so on affect how people experience the world. Philosopher Sara Bernstein puts the point this way:

Intersectionality ... refers to a type or token of experience faced by members of [intersecting identity] categories, as in experiences had by black women that are not entirely explicable by appeal to being black or to being a woman. (Bernstein 2020, 322)

Psychological research suggests that intersectional elements of identity even affect subconscious perceptions of others and their pain (Avenanti et al. 2010). To see why intersectionality poses a problem for empathy simulation, we need to understand what these simulations are doing. They claim to give you the experience of what it’s like to be a Black man, to experience homelessness, and so on. However, if how I experience the world is (literally) shaped by my racial, ethnic, and gender identities, and by other facts about my upbringing, then no VR simulation can give that experience to someone else. Going through “1,000 Cut Journey,” am I any closer to experiencing what it’s like for its main character, Michael Sterling, to face racist aggression? No. At best, I might have a sense of what it might be like if people began to treat me as if I were someone named Michael, but that is a very different experience from the one Michael would have (and the difference grows the less Michael and I resemble each other). This isn’t empathy. Worse, depending on the viewer, “1,000 Cut Journey” might be met with anger, sadness, even pleasure (imagine a committed white nationalist’s response).

It’s dangerous to assume that VR can make us more empathetic, that it can give us the experience of what it’s like to be someone else. To assume that VR can do this is to deny that intersectionality is a fundamental part of how we experience the world. In denying this, we make it harder to get at the roots of systemic injustice by painting a false picture of other people’s experiences. A simulation that puts me in the position of a Palestinian wouldn’t get me closer to understanding the experience of living in the Palestinian territories. From my position of privilege, it may be impossible to understand the first-person point of view of someone subject to Israeli occupation. To assume that what it means to be Black or Palestinian (or pregnant, or a cow!) is given simply by the point of view of a camera located in space erases the role identity plays in how we understand and see the world. This kind of understanding also isn’t necessary for someone to care about the institutionalized harms that marginalized peoples experience. We don’t need to empathize with someone’s experience to know that they’ve been wronged. If VR can’t make us more empathetic, and it can’t, is there any way to harness its powers for social justice? Yes, but we need to design simulations for sympathy, not empathy.

Michael Goldman, director of the U.S. Holocaust Museum, uses VR technology to help viewers learn about the Holocaust. Unsurprisingly, Goldman has run into the same problem I’ve been discussing here. His solution offers a useful model for the ethical design of social justice simulations in VR:

Goldman ... has discussed two issues that have come from displaying VR in the Holocaust Museum. Either the visitor minimizes their own experiences, where they think they should not feel bad for themselves, say, because a friend died of cancer, because a Holocaust victim experienced something worse. Or, the visitor over-empathizes with a Holocaust survivor, where they think they know how it feels to be in the Holocaust. To combat these two scenarios Goldman treats visitors as “engaged witnesses” where they recognize the trauma of others without taking that trauma upon themselves. (Thatcher 2019) 

VR can’t show us what it’s like to be someone we’re not. When simulations promise this, they often work against intersectional justice by implicitly rejecting the role that intersectionality plays in experience. Like Goldman, we should use VR in ways that take advantage of what it can actually do: generate sympathy. How do we do that? Don’t promise to put me in the shoes of someone else experiencing injustice. Put me in a position where I am myself but also an engaged witness. A George Floyd simulator would be unethical for many reasons, but a VR simulation that puts me at the scene of similar violence, structured to make me feel for those victimized, could be a powerful, and ethical, tool for correcting intersectional injustice.

References

‘1000 Cut Journey’ Launches at Tribeca Film Festival. (2018, April 20). Retrieved from https://brown.columbia.edu/1000-cut-journey-launches-at-tribeca-film-festival/

Alejandro G. Iñárritu: CARNE y ARENA (Virtually present, Physically invisible). (2017). Retrieved from https://www.lacma.org/art/exhibition/alejandro-g-inarritu-carne-y-arena-virtually-present-physicallyinvisible

Avenanti, A., Sirigu, A., & Aglioti, S. M. (2010). Racial bias reduces empathic sensorimotor resonance with other-race pain. Current Biology, 20(11), 1018-1022.

Bernstein, S. (2020). The metaphysics of intersectionality. Philosophical Studies, 177, 321-335.

Crenshaw, K. (1989). Demarginalizing the intersection of race and sex: A black feminist critique of antidiscrimination doctrine, feminist theory, and antiracist politics. University of Chicago Legal Forum, 139-167.

Crenshaw, K. (1991). Mapping the margins: Intersectionality, identity politics, and violence against women of color. Stanford Law Review, 43(6), 1241-1299.

Grimsley-Vaz, E. (2018). Creator of ‘1000 Cut Journey’ uses VR to help white liberals understand racism. Moguldom.com. Retrieved from https://moguldom.com/152786/creator-of-1000-cut-journey-uses-vr-to-help-white-liberals-understand-racism/

Ogle, E., Asher, T., & Bailenson, J. (2018). Becoming Homeless: A Human Experience. Virtual Human Interaction Laboratory. Retrieved from http://vhil.stanford.edu/becominghomeless/

Ramirez, E. (2017). Empathy and the limits of thought experiments. Metaphilosophy, 48(4), 504-526.

Ramirez, E. (2018). Ecological and ethical issues in virtual reality research: A call for increased scrutiny. Philosophical Psychology, 32(2), 211-233.

Thatcher, S. (2019). VR and the role it plays in museums. Retrieved from https://ad-hoc-museum-collective.github.io/GWU-museum-digital-practice-2019/essays/essay-9/

Jul 15, 2020