Social media actions following Capitol riot: U-M experts can discuss
The University of Michigan has experts who can discuss the deactivation of President Trump’s social media accounts, as well as the move by Google, Apple and Amazon to suspend the Parler app and the resulting shutdown of the site.
Libby Hemphill, associate professor of information and associate director of the Center for Social Media Responsibility, can discuss political communication through social media, as well as civic engagement, digital curation and data stewardship.
“Finally deplatforming Trump was a big move for social media platforms,” she said. “Coupled with other actions like shuttering QAnon groups and propaganda accounts ahead of the elections in Uganda, I hope that we’re seeing platforms step up to meet their public obligations. However, I don’t expect them to continue holding folks accountable unless extremists and disinformation campaigns stay bad for business.
“We should definitely consider whether three companies ought to have this much power over our communication networks, but Apple, Google and Amazon finally flexed their market muscles. They could do more to root out apps and customers who violate their terms, but deplatforming Parler was a good start.”
Contact: 734-615-9524, firstname.lastname@example.org, @libbyh
Sarita Schoenebeck, associate professor of information, specializes in research on social computing, social media and human-computer interaction. Several of her studies have focused on online harassment.
“For years, platforms have evaluated what kinds of content are appropriate or not by evaluating the content in isolation, without considering the broader social and cultural context that it takes place in,” she said. “This means that harmful content persists on the site, and content that should be acceptable may be removed. We need to revisit this approach. We should rely on a combination of democratic principles, community governance and platform rules to shape behavior.
“We also should center commitments to equity and justice in how platforms regulate behavior. Allowing some people to engage in hate speech and violence simply means that others can no longer participate safely or equitably, and that is not the kind of society—whether online or offline—that we should aspire to.”
Cliff Lampe, professor of information, studies the social and technical structures of large-scale technology-mediated communication, working with sites like Facebook, Wikipedia, Slashdot and Everything2. He has also been involved in the creation of multiple social media and online community projects. His current work looks at how the design of social media platforms encourages moderation, misinformation and social development.
“This is a formative moment for social media companies,” he said. “They have the obligation and right to police their platforms for the type of content they want to host. Still, many people feel a lack of agency, since the power of the platform can feel overwhelming to the individual and group. How social media platforms navigate this over the next few months could define the industry for a decade.”
Contact: 517-515-2494, email@example.com
Josh Pasek is an associate professor of communication & media and political science, faculty associate in the Center for Political Studies and core faculty for the Michigan Institute for Data Science at U-M. His research explores how new media and psychological processes shape political attitudes, public opinion and political behaviors. Current research explores how both accurate and inaccurate information might influence public opinion and voter decision-making and evaluates whether the use of online social networking sites such as Facebook and Twitter might be changing the political information environment.
“Tech companies are reacting to the events of this past week from a few different perspectives. Yes, individuals at these institutions are outraged at what they saw, but that isn’t really driving the change in policy. Most critically, these companies are afraid of the potential for regulation,” he said. “When they were accused of spreading misinformation, they reacted by providing minimal fact checking on claims that might mitigate that criticism. When they were being criticized for supposedly stifling voices on the political right, they bent over backwards to ensure that their policies would not have a disproportionate political impact even though the violations of those policies were far from politically neutral. And when they read the tea leaves last week, it had become clear that the most likely source of future regulation was from a Democratic Congress that would be more worried about dangerous information and incitement than ensuring that even the extremes of the political spectrum had a platform.
“Social media companies really don’t want to play a police role, but they are far more worried about regulation than about playing that role. That said, it may be a good thing, because it is really not clear that there are any other actors that we would trust more to do it.
“As for the future, I think additional regulation is coming regardless, so the big question is what that process ends up looking like, and how the public and politicians feel about the way the companies have balanced these concerns over time.”
Contact: Josh Pasek, 484-557-4594, firstname.lastname@example.org