Misinformation about the Israel-Hamas conflict is flooding social media, in particular Elon Musk’s platform X, where users have been sharing false and misleading claims about the assault. Gordon Pennycook, associate professor of psychology in the College of Arts & Sciences, studies misinformation. His research has investigated various interventions on social media, including accuracy prompts, fact-checking or debunking, crowdsourcing, and labels or warnings.
Pennycook says: “Every time there is a major event and information is at a premium, we see misinformation spread like wildfire. The pattern is by now very consistent, yet each time it happens there is a sudden surge of concern about misinformation that fades once the moment passes. We need tools that build resistance to misinformation before events like this occur.
“We can speed up fact-checking using the wisdom of the crowds. On X, the Community Notes program is actually quite successful – and has even fact-checked Elon Musk.
“All of us, as individuals, need to become more vigilant about what we share online and, therefore, about whether we contribute to the spread of misinformation. A simple step is to verify that something is accurate before sharing it.
“People are desperate for information, and the social media context may actively interfere with their ability to distinguish fact from fiction. That is, when people are deciding what to share, the social media context leads them to prioritize factors that are often orthogonal to accuracy: Is it surprising and interesting? Does it send the right message to my followers? What people need to do instead – and this is even more true when major events occur, not to mention ones tied to longstanding cultural conflicts – is first ask themselves, ‘Do I know if this is true?’”
For interviews contact Becka Bowyer, cell (607) 220-4185, email@example.com.