Publicly shaming harassers may be popular but it doesn’t bring justice

March 25, 2020
Contact: Laurel Thomas ltgnagey@umich.edu

ANN ARBOR—You’ve been harassed online and the social platform has removed the offending post, so that’s it, right? End of story.

University of Michigan researchers say, “Not so fast.”

Sarita Schoenebeck, associate professor at the U-M School of Information, says current means of dealing with online harassment mirror the criminal justice system by penalizing the perpetrator, but they don’t address justice and fairness for the person who has been wronged.


In a survey of 573 U.S. social media users published in the journal New Media & Society, Schoenebeck and colleagues sought to understand what people who are harassed would want social media sites to do to better support them. They found that while most participants wanted some sort of action that felt just or fair, a one-size-fits-all approach will not work.

For example, many participants liked the idea of social media sites requiring a public apology from the harasser. Such an apology could acknowledge harms to the person being harassed. It could also provide a public statement that the site thinks the harassment is not okay.

“We regularly expect children to apologize if they are mean to another child,” Schoenebeck said. “However, adults routinely make bad decisions and treat each other terribly online, and we don’t see people apologizing very often. In some cases, an apology can be an appropriate and powerful way to amend those bad decisions.”

Not all participants wanted this remedy, however. Transgender participants and Hispanic or Latino respondents, for example, liked the idea of an apology less than other groups on average, perhaps because an insincere apology could magnify the discrimination those groups already experience.

The study found some participants also liked the idea of publicly shaming harassers.

“This may be because social media sites currently fail to support harassment targets in any meaningful way, so people want to take matters into their own hands,” Schoenebeck said.

However, Schoenebeck and colleagues caution that public shaming online can quickly go wrong.

“Most people are not trained to determine proportionate punishments, and large groups of people online may not make for very good juries and judges,” she said.

The study highlights how most social media sites, including Facebook, Twitter and Instagram, currently rely on two approaches to sanction harassers: remove the offensive content or ban the harassers (or both). The team found that participants favor both approaches, but with some exceptions. People who report that they have harassed others did not like either approach. In other words, people who harass others online don’t want to be banned.


Participants who are American Indian or Alaska Native also did not like banning, perhaps due to their historical experiences of being forcibly removed from their own land, or more recent history of Facebook account bans due to names misaligned with the site’s “real name” policies, the researchers said.

People who identified as politically liberal, however, liked both user bans and content removal.

In general, participants did not like the idea of increasing or decreasing exposure to large audiences on the site. Those who are transgender, black or female, in particular, did not like the idea of more exposure.


“It could be that if exposed to a nonconsensual spotlight, some social media users may wish to remove themselves from the public eye rather than gaining a larger audience,” said Oliver Haimson, U-M assistant professor of information. “In the case of transgender people, widespread disclosure of their trans identity may render them especially vulnerable to violence and discrimination.”

The researchers highlight the limitations of the two approaches social media sites rely on: removing content and banning users. Both are difficult to enforce because harassers can easily create a new account and continue the harassment, especially on public sites like Twitter. They argue that these approaches, like criminal justice models in the U.S., focus more on punishing people than on educating or rehabilitating them.

The researchers suggest that alternatives to criminal justice systems, like racial justice or restorative justice, may be more effective at remediating harm to targets of harassment.

Parts of this research were funded by the National Science Foundation under Grant Nos. 1763297 and 1552503.