Reviewing evidence improves crowdworkers’ misinformation judgments, reduces partisan bias
People make better and less biased judgments about misinformation after searching the internet for corroborating evidence, according to a new University of Michigan study.
The research indicates that when members of a large panel of laypeople, described as lay raters or crowdworkers, each make independent judgments after conducting online searches, their collective judgments can be better than those of a small panel of journalists.
The study, published in Collective Intelligence, looked at how people evaluate news articles for potential misinformation. Researchers assembled liberal and conservative crowd raters to make judgments about 374 articles.
In a “no research” condition, people simply viewed the articles and rendered judgments. In an “individual research” condition, they were also asked to search for corroborating evidence and provide a link to the best evidence they found. In a “collective research” condition, they did not search themselves but instead reviewed links collected from workers in the individual research condition.
Both research conditions reduced partisan disagreement in judgments, the findings indicated. The individual research condition was most effective at producing alignment with journalists’ assessments. In this condition, the judgments of a panel of 16 or more crowdworkers were better than those of a panel of three journalists.
Paul Resnick, the study’s lead author and U-M professor of information, said the results show that juries composed of lay raters could be a valuable resource in assessing misinformation.
“In settings where it is expensive or impossible to assemble a panel of journalists, it may be reasonable to use the judgments of crowdworkers as a proxy for the ground truth, but only if you ask the crowdworkers to do some searching before making judgments,” he said.
Requiring raters to become more informed before rendering judgments about misinformation reduces partisanship and improves the quality of their ratings, Resnick said.
The study’s other authors were Aljohara Alfayez, a U-M graduate; Jane Im, a U-M doctoral student; and Eric Gilbert, U-M associate professor of information.