Facebook under fire: U-M experts available

March 28, 2018
Written By:
Laurel Thomas

EXPERTS ADVISORY

As the Facebook story develops, University of Michigan experts are available to comment on ethics in data science, potential pitfalls of privacy policies, social media responsibility, Facebook’s need for a new funding model, and more.

Garlin Gilchrist

Garlin Gilchrist is the executive director of the new School of Information Center for Social Media Responsibility. The center will make U-M research usable to media makers, media consumers and platform companies, and will produce designs, systems and metrics that aim to steer social media use toward more civil and beneficial discourse.

“With great power and great reach comes great responsibility. Facebook and the social media platforms that dominate the attention economy are now being called to understand what that responsibility entails,” he said.

“The social contract between users and the platform companies has been strained: it’s time to re-negotiate. It will take a collective effort that includes researchers like the experts we have at the Center for Social Media Responsibility, decision-makers at platform companies, engineers, policymakers, media companies and users to establish new norms, better understand roles, and create real accountability mechanisms for all parties.”

Contact: 734-763-2285, garlin@umich.edu


Erik Gordon

Erik Gordon is a clinical assistant professor at the Ross School of Business who focuses on entrepreneurship and technology commercialization.

“The data scandal means that Facebook has to consider a new model in which it is responsible for content. That is a costly model. It’s a model that will lose posters and viewers,” he said. “Facebook’s revenue model would be significantly hurt by regulatory changes in how it can turn data on users into revenue and by users getting off Facebook or using it less because of data privacy concerns.

“Facebook is not the only company facing threats. Twitter could be hurt even more because it may gather a higher proportion of its revenue from data than Facebook does.”

Contact: 734-764-5274, rmegordo@umich.edu


Peter Honeyman

Peter Honeyman is a research professor in computer science and engineering.

“Is there a lesson to be learned? It’s an old one: if you’re not paying for it, then you’re not a customer, you’re the product,” he said. “I concluded some years ago that the only safe and realistic assumption to make is that loading anything onto Facebook—pictures, contacts, personal information, etc.— is equivalent to publishing that information to the world.

“So while I remain an occasional Facebook user, I use that assumption to limit my exposure in a few ways. I also try to disable any Facebook extensions that make it easy for other applications to hoover up all my Facebook data. One way is to decline to use Facebook login to access other web services. Another is to disable apps, websites and plugins in the Facebook App Settings.”

Contact: 734-763-4413, honey@umich.edu


H V Jagadish

H V Jagadish is a professor of computer science and engineering and expert on ethics in computer science. He has developed a MOOC on Ethics in Data Science, available through multiple platforms.

“Think about what good salespeople do,” he said. “They try to determine what matters to each customer, and customize their pitch to best address the customer’s issues of importance. Now put this idea on steroids. Recent research shows that by observing a few hundred of your Facebook ‘likes,’ it is possible for an algorithm to predict your personal preferences better than your spouse can predict them.

“It is the role of politicians, and those who serve them, to figure out how to appeal to us and win our votes. It is the goal of businesses to persuade us to buy from them. They can hardly be blamed for trying to do this more effectively. The problem is that they now have tools so effective that the persuasion happens on a playing field we perceive as not being even.

“So, what is to be done? I believe that the solution lies in our developing ethical principles regarding the analysis and use of data.”
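
As a rough illustration of the kind of likes-based prediction Jagadish describes, the sketch below trains a simple classifier on a synthetic matrix of page “likes” to predict a binary preference. The data, page counts and resulting accuracy are invented for illustration; this is not the research he cites, only a minimal example of the general technique.

```python
# Minimal, hypothetical sketch of likes-based trait prediction.
# All data here is synthetic; real studies use actual page "likes" per user.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "likes" matrix: 1 means the user liked that page.
n_users, n_pages = 2000, 300
likes = rng.integers(0, 2, size=(n_users, n_pages))

# Pretend a small subset of pages carries a weak signal about some preference.
signal_pages = rng.choice(n_pages, size=20, replace=False)
score = likes[:, signal_pages].sum(axis=1) + rng.normal(0, 2, n_users)
trait = (score > score.mean()).astype(int)  # the binary preference to predict

X_train, X_test, y_train, y_test = train_test_split(
    likes, trait, test_size=0.25, random_state=0)

# Fit a plain logistic regression on the likes matrix and check held-out accuracy.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```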

Contact: 734-763-4079, jag@umich.edu


Harsha Madhyastha

Harsha Madhyastha is an associate professor of computer science and engineering and an expert on distributed systems, networking, and security and privacy.

“Unfortunately, there are no obvious lessons,” he said. “One could argue for stronger default privacy protections. But, on the other hand, having social networks prevent app developers from accessing data shared by users’ friends must be weighed against how this would constrain the types of apps that can be developed.

“For example, a few years ago, we developed a Facebook app called MyPageKeeper to flag spam and malicious posts. When a user installed MyPageKeeper, it was vital for the app to be able to access posts made by the user’s friends for two reasons. First, by flagging unsafe posts made by a user’s friends, MyPageKeeper was able to protect users from posts that showed up in their news feeds. Second, MyPageKeeper’s comments on unsafe posts enabled others to discover the app and install it to keep themselves safe. If Facebook enforced that no app could access information shared by the user’s friends, then useful apps like MyPageKeeper would no longer be feasible.”
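
For illustration only, here is a minimal sketch of the kind of feed-scanning logic Madhyastha describes: checking posts that would come from the platform, including friends’ posts, against simple spam heuristics. The phrase list, domains and sample posts are hypothetical, and MyPageKeeper’s actual detection logic is not reproduced here.

```python
# Hypothetical sketch of feed-scanning in the spirit of a post-flagging app.
# The blocklist, heuristics and example posts are made up for illustration.
from urllib.parse import urlparse

SUSPICIOUS_DOMAINS = {"free-giftcards.example", "login-verify.example"}
SPAM_PHRASES = ("free gift card", "claim your prize", "verify your account")

def is_suspicious(post: dict) -> bool:
    """Return True if a post matches a simple phrase or domain heuristic."""
    text = post.get("message", "").lower()
    if any(phrase in text for phrase in SPAM_PHRASES):
        return True
    for word in text.split():
        if word.startswith("http"):
            domain = urlparse(word).netloc.lower()
            if domain in SUSPICIOUS_DOMAINS:
                return True
    return False

# Example feed: in a real app these posts would come from the platform's API,
# which is exactly the friend-data access the quote says is being restricted.
feed = [
    {"id": "1", "author": "friend_a",
     "message": "Claim your prize at http://free-giftcards.example/win"},
    {"id": "2", "author": "friend_b",
     "message": "Lunch photos from yesterday!"},
]

for post in feed:
    if is_suspicious(post):
        print(f"Flagging post {post['id']} by {post['author']} as potentially unsafe")
```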

Contact: 734-647-8086, harshavm@umich.edu


Aviv Ovadya

Aviv Ovadya, chief technology officer for the Center for Social Media Responsibility, is a misinformation engineering and design consultant who predicted the 2016 fake news crisis. Ovadya, a Knight News Innovation Fellow at the Tow Center for Digital Journalism at Columbia University, sounded the alarm about the vulnerability of social media platforms to propaganda, misinformation and dark targeted advertising from foreign governments months before the November 2016 presidential election.

“Even as we push Facebook and similar companies to take on more responsibility, we need to be mindful that sometimes pushing too hard for one thing, such as privacy, may involve tradeoffs with data portability, with the ability to determine if foreign actors are manipulating online discourse, or even with the ability to hold Facebook itself accountable,” he said. “If even aggregated Facebook data is so private that only Facebook can see it, then how can independent bodies determine if Facebook is acting responsibly?”

Contact: avivo@umich.edu


Will Potter

Will Potter, the Howard R. Marsh Visiting Professor of Journalism, teaches courses on investigative journalism, social movements and whistleblowing. As a professional journalist, his writing and opinions have appeared in the Washington Post, CNN, National Geographic, Le Monde, The Sydney Morning Herald, VICE and Rolling Stone. He is best known for his work challenging government repression and the labeling of protest as “terrorism.”

“When someone violates your privacy for profit, like we’re seeing with Facebook and Cambridge Analytica, it feels like you’ve been robbed. But we have to remember that we gave Facebook the keys to our personal information,” he said. “And it doesn’t stop there.

“Facebook is just one of many social media platforms aggregating our lives, and most users accept these companies’ terms of use without having read them. Unless we hold these companies accountable, they will continue to dominate other aspects of our lives. This data breach should put an end to any possibility of Facebook being used for voting, and it’s an opportunity for all of us to rethink the trust we have put in social media companies.”

Contact: 734-764-7718, wpot@umich.edu


Ceren Budak

Ceren Budak, assistant professor of information and of electrical engineering and computer science, researches how much content on Twitter is dubious and how fake, hyperpartisan and clickbait messaging spreads in the form of news.

“While it is important to investigate ways through which we can keep platforms accountable, we should also not overlook the importance of building solutions to improve the technological literacy of our citizens,” she said. “As individuals increasingly interact with each other and with information through online platforms—Facebook being only one of these platforms—identifying common misconceptions about how these platforms work and overcoming them through informational tools becomes all the more important.”

Contact: 734-763-3384, cbudak@umich.edu