Online content moderators likely to experience burnout, U-M study suggests

March 8, 2023
Written by: Noor Hindi, School of Information
Concept illustration of a volunteer community manager. Image credit: Nicole Smith, made with Midjourney

Online communities play an important role in creating a sense of connection between strangers. But what happens when the people moderating our favorite online communities quit?

Why do they quit, and what strategies can companies like Meta and Reddit implement to help prevent burnout?

A study by University of Michigan School of Information researchers, led by doctoral candidate Angela Schöpke-Gonzalez, finds that volunteer content moderators, or VCMs, experience burnout stemming from interpersonal conflict among moderators, time constraints and daily exposure to toxic online behavior.

“It’s the unpaid labor of volunteer content moderators that make it possible for us, in many cases, to enjoy environments that support our well-being,” Schöpke-Gonzalez said. “We browse the internet every day and many people are on social media platforms, but we often forget that it’s people that are responsible for keeping our information ecosystems alive.”

The study aims to draw attention to the critical role of VCMs, explore what causes their burnout, and help companies understand how to better support VCMs and prevent psychological distress.


“VCMs experience many of the same psychological distress challenges as crisis hotline volunteer responders, caregivers and volunteer support providers for persons who have experienced violence,” Schöpke-Gonzalez said. “Researchers, platforms and moderators can learn from work addressing psychological distress among similar volunteer groups to craft interventions that support VCMs.”

Schöpke-Gonzalez’s broader research focuses on how computational social science research processes can avoid perpetuating social harms. She cited the example of an African American man wrongfully accused of stealing watches at a Shinola store in Detroit in 2020.

“What steps can computational social science research take to mitigate the risk of facial recognition algorithms’ use in law enforcement leading to wrongful detentions like that of Michigander Robert Julian-Borchak Williams?” she said.

Schöpke-Gonzalez is working on her doctoral dissertation with Libby Hemphill, U-M associate professor of information, who co-authored the study along with doctoral candidate Shubham Atreja and former research assistants Han Na Shin and Najmin Ahmed.