U-M research reveals racism challenges in human-computer interaction

June 11, 2020
Written By:
Laurel Thomas
Contact:
  • umichnews@umich.edu

FACULTY Q&A

Recent events have brought to the forefront issues of racial disparities in many sectors of society. One area that impacts much of life inequitably—education, work, commerce, safety and security, and leisure pursuits, among others—is our increasing use of computers.

Researchers at the University of Michigan, Northwestern University and Carnegie Mellon University recently completed a unique study on the state of human-computer interaction, a growing field overall—both as a career and in everyday use of electronic devices—but not among minority groups.

Their study, which was honored as a Best Paper at the annual ACM Conference on Human Factors in Computing Systems, uses personal narratives to get to the bottom of some of the experiences of nonwhite participants in the field. The authors identify as being of African American, Asian, mixed-race and multicultural backgrounds.

Ihudiya Finda Ogbonnaya-Ogburu and Alexandra To, first co-authors of the study, explain their research. Ogbonnaya-Ogburu is a doctoral student at the U-M School of Information and To is a doctoral candidate in human-computer interaction at Carnegie Mellon.

Your paper is about critical race theory for human-computer interaction. Let’s first start by defining the terms critical race theory and human-computer interaction.

Ihudiya Finda Ogbonnaya-Ogburu

Ogbonnaya-Ogburu: Critical race theory is a theoretical framework developed in the 1970s. It was created to highlight the colorblind approaches used to examine civil rights law cases, to call attention to the inherent racism in the policies and laws that impacted minority US citizens, and to highlight individual minoritized voices. What we do is apply that framework to a research community that considers how people interact with computers, which is what we mean by human-computer interaction. We study these interactions to inform how we create technology.

You say in the paper that recent events (as of your writing), Black Lives Matter, the white supremacist rally in Virginia, fake news, election meddling—and presumably the more recent George Floyd protests—have made issues of race front and center, but that the issue in HCI is ongoing despite some efforts to overcome it. Please explain.

Ogbonnaya-Ogburu: One tenet of critical race theory that we adapt in our paper is that racism is ordinary. Those who do not experience racism may think that it is an aberration, as if what happened to George Floyd were an anomaly. But racism is present in every online search, every social media post, every Amazon purchase and every mass-market technology, because the groups with power over these systems are often insufficiently inclusive. For example, if you do a Google image search for "family holiday," you get a bunch of white-presenting people at the beach. People of color often have to make do with these images, or think to insert their own identities in the keywords to find images of individuals who look like themselves, thus reinforcing any self-consciousness with respect to race.

You chose a storytelling mode to collect your research. Why?

Alexandra To

To: “Counterstories” are narratives that come from people in the margins. Scientific discourse often emphasizes the dominant narrative of the majority (for example, discussions of sample size, generalization, edge cases). Through storytelling, we center the real experiences of BIPOC (Black, Indigenous, people of color).

If possible, can you summarize the stories you heard?

Ogbonnaya-Ogburu: The stories mentioned in the article are our personal stories of how racism has impacted us, and the communities we partner with to perform our research. There are a total of nine stories in the paper, and they've been edited down, so it's hard to summarize them further, but here's one:

A filter bubble is a phenomenon where someone on the internet encounters information and opinions that reinforce their own beliefs, based on their search history, location and click behavior. The reality is that, because of this technical innovation, there are members of this society who are told that the police are just and who, simply because of their search history, may never encounter the blatant acts of racism, witnessed by millions of Americans, against George Floyd, Breonna Taylor, Ahmaud Arbery, Tony McDade, Trayvon Martin, Eric Garner and Sandra Bland.

It strikes me that some of the stories are not unique to HCI. For example, editing away voices of color would seem to be an issue across the academy. Would you explain this story briefly and talk about whether it is a bigger problem than one discipline? Are there other stories that represent issues across disciplines?

To: Though we focus on HCI, racism is ubiquitous. Many of our points are applicable not only across academia but to society as a whole (for example, editing and removing voices of color has become a publicly known issue in journalism in Pittsburgh this week). In our paper, we talk about being uncomfortable editing quotes under the racist guise of readability and grammatical correctness. Participants' use of vernacular English was often an indicator that trust and comfort had been established with them. Even in computing, scholars have increasingly come to recognize that we can't separate technology from history and prevailing systems of oppression (see Ruha Benjamin, Safiya Noble).

There have been many discussions this past week about privilege, failure to embrace difference, failure to bring stories to light, and so on. Aren't those similar to some of the stories you found as well?

Ogbonnaya-Ogburu: Our stories are specific to our contexts, but the underlying issues are everywhere. In our paper, we highlight how we have experienced racism in courses, in research, among our research participants, with colleagues, in corporations, in academia and everywhere else. Our focus was to expose, acknowledge and undo racism in our sphere of influence. As researchers, we are not absolved of racism, and our paper challenges our research community (including ourselves) to fight against racism within ourselves, our research and the sociotechnical world.

What has to happen to make HCI more inclusive?

Ogbonnaya-Ogburu: In our paper, we list a number of things that the HCI community can do to improve matters, but an important aspect of critical race theory is valuing the voices of marginalized members in your community. Honest and direct conversations require strong relationships—trust and the removal of the fear of retaliation. The HCI community can start by taking the time to listen, act and regularly evaluate its anti-racist initiatives. However, communities don't exist in a vacuum—they inherit qualities from the larger society. We must all be committed to nothing short of the global elimination of racism.

A third first author of the study is Angela D. R. Smith, a doctoral candidate in technology and social behavior at Northwestern. The senior author is Kentaro Toyama, the W. K. Kellogg Professor of Community Information at the U-M School of Information.
