Explore Commentaries

New sources of inaccuracy? A conceptual framework for studying AI hallucinations
Anqi Shao
In February 2025, Google’s AI Overview fooled itself and its users when it cited an April Fool’s satire about “microscopic bees powering computers” as factual in search results (Kidman, 2025). Google did not intend to mislead, yet the system produced a confident falsehood.

Disparities by design: Toward a research agenda that links science misinformation and socioeconomic marginalization in the age of AI
Miriam Schirmer, Nathan Walter and Emőke-Ágnes Horvát
Misinformation research often draws optimistic conclusions; fact-checking, for example, has been established as an effective means of reducing false beliefs. However, the field rarely considers the socioeconomic disparities that shape who is most vulnerable to science misinformation. Historical and systemic inequalities have fostered mistrust in institutions and limited access to credible information, as when Black patients distrust public health guidance because of past medical racism.

Gendered disinformation as violence: A new analytical agenda
Marília Gehrke and Eedan R. Amit-Danhi
The potential for harm embedded in mis- and disinformation content, regardless of intentionality, opens space for a new analytical agenda that investigates the weaponization of identity-based features such as gender, race, and ethnicity through the lens of violence. We therefore lay out the triangle of violence to support new studies of the multimedia content, victims, and audiences of false claims.

Conspiracy Theories
The climate lockdown conspiracy: You can’t fact-check possibility
Michael P. A. Murphy
The climate lockdown conspiracy claims that a clandestine group of elites is planning to use climate change as a justification to enact widespread lockdowns and curtail freedoms. This conspiracy draws on a wide range of unconnected real-world events and suggests that the mere possibility of their happening again is all the proof required.

Conspiracy Theories
Are conspiracy beliefs a sign of flawed cognition? Reexamining the association of cognitive style and skills with conspiracy beliefs
Roland Imhoff and Tisa Bertlich
Throughout human history, political leaders, oppositional forces, and businesspeople have frequently coordinated in secret for their own benefit and the public’s disadvantage. In these cases, conspiracy theories are capable of accurately describing our environment. However, the vast majority of research today operationalizes conspiracy theories as irrational beliefs that contradict our everyday knowledge.

Misinformed about misinformation: On the polarizing discourse on misinformation and its consequences for the field
Irene V. Pasquetto, Gabrielle Lim and Samantha Bradshaw
The field of misinformation is facing several challenges, from attacks on academic freedom to polarizing discourse about the nature and extent of the problem for elections and digital well-being. However, we see this as an inflection point and an opportunity to chart a more informed and contextual research practice.

Beyond the deepfake hype: AI, democracy, and “the Slovak case”
Lluis de Nadal and Peter Jančárik
Was the 2023 Slovakia election the first swung by deepfakes? Did the victory of a pro-Russian candidate, following the release of a deepfake allegedly depicting election fraud, herald a new era of disinformation? Our analysis of the so-called “Slovak case” complicates this narrative, highlighting critical factors that made the electorate particularly susceptible to pro-Russian disinformation.

Misinformation reloaded? Fears about the impact of generative AI on misinformation are overblown
Felix M. Simon, Sacha Altay and Hugo Mercier
Many observers of the current explosion of generative AI worry about its impact on our information environment, with concerns being raised about the increased quantity, quality, and personalization of misinformation. We assess these arguments with evidence from communication studies, cognitive science, and political science.

A focus shift in the evaluation of misinformation interventions
Li Qian Tay, Stephan Lewandowsky, Mark J. Hurlstone, Tim Kurz and Ullrich K. H. Ecker
The proliferation of misinformation has prompted significant research efforts, leading to the development of a wide range of interventions. There is, however, insufficient guidance on how to evaluate these interventions. Here, we argue that researchers should consider not just the interventions’ primary effectiveness but also ancillary outcomes and implementation challenges.

Mis- and disinformation studies are too big to fail: Six suggestions for the field’s future
Chico Q. Camargo and Felix M. Simon
Who are mis-/disinformation studies for? What agenda does the field serve? How can it be improved? While the increased attention the topic has received in recent years is healthy, it has also led to an explosion of papers in all directions, and the field has been subject to various criticisms and attacks.