All Articles

People rely on their existing political beliefs to identify election misinformation

Sora Park, Jee Young Lee, Kieran McGuinness, Caroline Fisher and Janet Fulton

Rather than assuming that people are motivated to fact-check, we investigated the process people go through if and when they encounter political misinformation. Using a digital diary method, we asked 38 participants to collect examples of political misinformation during Australia’s 2025 federal election and explain why they determined each to be misinformation (n = 254).

Emotional resonance and participatory misinformation: Learning from a K-pop controversy

Sungha Kang, Rachel E. Moran and Jin Ha Lee

In today’s digital media environment, emotionally resonant narratives often spread faster and stick more firmly than verifiable facts. This paper explores how emotionally charged communication in online controversies fosters not only widespread engagement but also the participatory nature of misinformation. Through a case study of a K-pop controversy, we show how audiences act not just as consumers but as co-authors of alternative narratives in moments of uncertainty.

Research Note

Prebunking misinformation techniques in social media feeds: Results from an Instagram field study

Sander van der Linden, Debra Louison-Lavoy, Nicholas Blazer, Nancy S. Noble and Jon Roozenbeek

Boosting psychological defences against misleading content online is an active area of research, but the transition from the lab to real-world uptake remains a challenge. We developed a 19-second prebunking video about emotionally manipulative content and showed it as a Story Feed ad to N = 375,597 Instagram users in the United Kingdom.

Commentary

Reframing misinformation as informational-systemic risk in the age of societal volatility

Nuurrianti Jalli

When a bank run, a pandemic, or an election spirals out of control, the spark is often informational. In 2023, rumors online helped accelerate the collapse of Silicon Valley Bank. During COVID-19, false claims about vaccines fueled preventable harms by undermining public trust in health guidance, and election lies in the United States fed into the broader dynamics that culminated in the January 6 Capitol attack.

Commentary

Towards the study of world misinformation

Piero Ronzani

What if nearly everything we think we know about misinformation came from just a sliver of the world? When research leans heavily on online studies from a few wealthy nations, we risk drawing global conclusions from local noise. A WhatsApp group of fishermen, a displaced community in a refugee camp, or a bustling market in the Global South are not marginal examples of information environments; such contexts call for an evolution of how we study misinformation.

Research Note

Information control on YouTube during Russia’s invasion of Ukraine

Yevgeniy Golovchenko, Kristina Aleksandrovna Pedersen, Jonas Skjold Raaschou-Pedersen and Anna Rogers

This research note investigates the aftermath of YouTube’s global ban on Russian state-affiliated media channels in the wake of Russia’s full-scale invasion of Ukraine in 2022. Using over 12 million YouTube comments across 40 Russian-language channels, we analyzed the effectiveness of the ban and the shifts in user activity before and after the platform’s intervention.

Research Note

People are more susceptible to misinformation with realistic AI-synthesized images that provide strong evidence to headlines

Sean Guo, Yiwen Zhong and Xiaoqing Hu

The development of artificial intelligence (AI) allows rapid creation of AI-synthesized images. In a pre-registered experiment, we examine how properties of AI-synthesized images influence belief in misinformation and memory for corrections. Realistic and probative (i.e., providing strong evidence) images predicted greater belief in false headlines.

Not so different after all? Antecedents of believing in misinformation and conspiracy theories on COVID-19

Florian Wintterlin

Misinformation and conspiracy theories are often grouped together, but do people believe in them for the same reasons? This study examines how these conceptually distinct forms of deceptive content are processed and believed, using the COVID-19 pandemic as context. Surprisingly, despite their theoretical differences, belief in both is predicted by similar psychological factors, particularly conspiracy mentality and the perception that truth is politically constructed, suggesting that underlying distrust in institutions may outweigh differences between types of deceptive content in shaping susceptibility.

Research Note

LLMs grooming or data voids? LLM-powered chatbot references to Kremlin disinformation reflect information gaps, not manipulation

Maxim Alyukov, Mykola Makhortykh, Alexandr Voronovici and Maryna Sydorova

Some of today’s most popular large language model (LLM)-powered chatbots occasionally reference Kremlin-linked disinformation websites, but it might not be for the reasons many fear. While some recent studies have claimed that Russian actors are “grooming” LLMs by flooding the web with disinformation, our small-scale analysis finds little evidence for this.
