Gordon Pennycook, assistant professor of behavioural science at the University of Regina, is researching why people share disinformation on social media and what would make people think more critically about the information they share.
Social media can become a whirlpool of information both true and false, but a lot of disinformation is only circulated because people don’t think critically before they click the share button, according to a University of Regina professor and researcher.
Gordon Pennycook, an assistant professor of behavioural science at the U of R, has been researching why disinformation — false information intended to mislead people — is so widely shared on social media and what can be done to prevent it from spreading.
At the heart of that research is a single question: Why do people believe and share false content online?
While many may think people share fake news on social media to push a certain political or ideological agenda, Pennycook said this does not appear to be the case.
“What we found is that’s not really a good explanation of what’s going on. Rather, the evidence was more in line with the idea that people are just not thinking that much when they’re on social media,” he said.
“It’s mostly a lazy thinking problem.”
In one experiment, Pennycook asked one group of participants whether they thought the news headline in a Facebook post was accurate; a second group was not asked about accuracy at all. He then showed both groups a series of other Facebook posts and asked which ones they would share.
The group that had first been asked to consider accuracy was “more discerning” in what it chose to share afterward, he said. This suggests that a simple reminder to think critically about the information being shared on social media is sometimes enough to prevent its spread.
Pennycook recently received more than $316,000 from the Social Sciences and Humanities Research Council to further research the topic and what effective interventions would look like on social media to encourage more critical thinking.
While he noted that more research needs to be done on effective interventions, he said the solutions may not need to be complicated.
“Getting them to stop and think is the first step,” he said.
Social media platforms could run ads reminding users of the importance of accuracy, or display an alert prompting users to consider whether the information in a post is true just before they click to share. But these methods only work if people can discern fact from fiction. Pennycook suggested some basic education on how to recognize disinformation could also help, such as encouraging people to check the source or to ask themselves whether the information makes sense.
Ideally, Pennycook would like to see social media companies implement some kind of simple intervention to encourage people to consider the accuracy of the information they’re sharing.
“The hope is just that with enough pressure and evidence for the effectiveness, eventually it will convince them to do it, so that’s the goal,” he said.