Many people who share fake news online do so because they aren't paying close attention to what they're sharing, according to a new study. The research found that simply prompting people to think about the accuracy of their news content helps curtail the spread of falsehoods.
"When deciding what to share on social media, people are often distracted from considering the accuracy of the content," the authors, from the Hill/Levene Schools of Business at the University of Regina and the Sloan School of Management at the Massachusetts Institute of Technology (MIT), wrote in the new paper published in Nature.
While the spread of inaccurate or false information and conspiracy theories is nothing new (the fossil fuel industry's climate-denying disinformation campaign dates back decades), the study's findings undercut the notion that there is a widespread desire among the public to actively spread disinformation. Rather, it adds further evidence showing how social media allows fake news to spread rapidly, and how to slow it down.
Online disinformation seemed to hit a fever pitch in the past year, with the spread of the violent QAnon conspiracy, Covid denial, 2020 election conspiracies, and pro-insurrection voices all intermingling and cross-pollinating.
But instead of focusing on the malign actors who create disinformation, such as the Koch-backed network of think tanks, charities, and politicians seeking to undermine climate science, or, more recently, coordinated social media campaigns and troll farms, sometimes backed by government intelligence agencies, aimed at undermining elections around the world, the new Nature study looks at the much larger set of everyday social media users who share this type of misinformation online, often unwittingly, or at least not with malicious intent. The results offer some reasons for hope, as well as some tools to fight disinformation.
The study surveyed thousands of U.S. Twitter and Facebook users. It found that most people do not wish to spread fake news; in fact, they rate accuracy as an important principle. When asked what motivates sharing, participants rated accuracy higher than other factors, such as whether a piece of news was interesting, funny, or politically aligned with their beliefs. Moreover, most people are fairly good at distinguishing accurate news from false news, and most do not share inaccurate news for hyperpartisan reasons either.
Instead, the researchers found that many people spread fake news without thinking much about whether the information is accurate. It's a problem of inattention, made worse by social media, which pushes people to sift through news rapidly and superficially.
"This means that when thinking about the rise of misinformation online, the issue is not so much a shift in people's attitudes about truth, but rather a more subtle shift in attention to truth," two of the study's authors, David Rand of MIT and Gordon Pennycook of the University of Regina, wrote in Scientific American, summarizing their findings (emphasis in original).
The researchers conducted several experiments to parse out contributing factors. In one, participants were shown a mix of true and false news stories; one group was asked to decide whether the headlines were accurate, while another group was asked whether they would share them on social media.
Interestingly, the participants judging accuracy did a reasonably good job of distinguishing accurate stories from fake ones: they rated true stories as accurate more often than false stories, by a margin of 55 percentage points.
But the group deciding whether to share a story chose to share fake stories at a much higher rate than those stories were judged accurate by the other group. Among false headlines alone, 50 percent more were shared than were rated as accurate.
In other words, when asked about accuracy, people were good at telling accurate stories from fake ones. But when asked about sharing, people chose to share more stories, even fake ones. And they chose to share stories that fit their political views at a much higher rate (by 19 percentage points) than stories that went against their political beliefs.
That would seem to suggest an ideological or partisan motivation. But the authors conducted another experiment, with over 5,000 participants on Twitter who had previously shared news from Breitbart and Infowars, two sites that professional fact-checkers have rated as highly untrustworthy. The authors sent a private Twitter message to the participants and asked them to judge whether a single non-political headline was accurate.
The researchers then monitored the participants' subsequent sharing behavior and found a significant improvement in sharing choices; in the 24 hours after the prompt, participants shared relatively more news from reliable outlets such as CNN and relatively less from sources of inaccurate information like Infowars.
The authors surmise that simply redirecting attention toward the concept of accuracy helped cut down on the sharing of false information. "[W]e find that the single accuracy message made users more discerning in their subsequent sharing decisions," they wrote in their study. "Relative to baseline, the accuracy message increased the average quality of the news sources shared."
The researchers replicated these experiments with Covid-19 information and found a similar dynamic.
The study shows a disconnect between what people share and what they consider accurate, suggesting that people share content they themselves might not necessarily believe.
These studies help us see past the illusion that everyday citizens on the other side must be either stupid or evil- instead, we are often simply distracted from accuracy when online. Another implication of our results is that widely-RTed claims are not necessarily widely BELIEVED
โ David G. Rand (@DG_Rand) March 17, 2021
Individuals scroll quickly through a social media news feed, which tends to mix accurate and inaccurate information with emotionally engaging content. And crucially, the authors wrote, users are provided with "instantaneous and quantified social feedback on sharing." The quest for retweets and likes, in other words, "may discourage people from reflecting on accuracy."
Rather than a wholesale rejection of truth, people lazily pass on inaccurate information because that tends to be what is rewarded on social media.
The good news is that even small interventions, such as the prompt asking whether headlines were accurate, redirected people away from the tendency to share false information. This suggests that social media platforms could, perhaps, periodically survey users on the accuracy of selected headlines as a way to subtly remind them to think about accuracy, the authors say.
Twitter has recently been taking steps to slow the spread of misinformation. Last year, the platform introduced a feature that reminds people to read an article before retweeting it, which it says has shown promising results. The platform also began tagging misleading tweets with disclaimers.
The new study's authors concede that the research is limited to the sharing of political news among people in the United States. They note that follow-up research could examine the impact of subtle accuracy nudges on coordinated disinformation campaigns, such as climate denial or election-fraud claims, which are backed by groups actively working to promote a falsehood.
In a recent analysis, DeSmog found that dozens of prominent climate deniers supported the January 6 insurrection in Washington, D.C. They spread debunked claims about election fraud and in some cases supported political violence. This is the type of campaign whose content, as the Nature study illustrates, was then likely passed along by many more people who may not have taken the time to think about its accuracy.
Experts have identified tools and methods for protection against malicious disinformation campaigns, such as "prebunking," which involves learning about the tactics and tricks of bad actors before being exposed to them. However, such campaigns of weaponized disinformation are potentially more challenging to combat than one-off fake news stories.
Main image: Social media apps. Credit: Jason Howie (CC BY 2.0)