Explore All Articles

Fact-opinion differentiation
Matthew Mettler and Jeffery J. Mondak
Statements of fact can be proved or disproved with objective evidence, whereas statements of opinion depend on personal values and preferences. Distinguishing between these types of statements contributes to information competence. Conversely, failure at fact-opinion differentiation can foster resistance to corrections of misinformation and susceptibility to manipulation.

Debunking and exposing misinformation among fringe communities: Testing source exposure and debunking anti-Ukrainian misinformation among German fringe communities
Johannes Christiern Santos Okholm, Amir Ebrahimi Fard and Marijn ten Thij
Through an online field experiment, we test traditional and novel counter-misinformation strategies among fringe communities. Though generally effective elsewhere, traditional strategies have not been tested in fringe communities, nor do they address the online infrastructure of misinformation sources that supports such consumption. Instead, we propose activating source criticism by exposing sources’ unreliability.

Seeing lies and laying blame: Partisanship and U.S. public perceptions about disinformation
Kaitlin Peach, Joseph Ripberger, Kuhika Gupta, Andrew Fox, Hank Jenkins-Smith and Carol Silva
Using data from a nationally representative survey of 2,036 U.S. adults, we analyze partisan perceptions of the risk disinformation poses to the U.S. government and society, as well as the actors viewed as responsible for and harmed by disinformation. Our findings indicate relatively high concern about disinformation across a variety of societal issues, with broad bipartisan agreement that disinformation poses significant risks and causes harm to several groups.

Measuring what matters: Investigating what new types of assessments reveal about students’ online source evaluations
Joel Breakstone, Sarah McGrew and Mark Smith
A growing number of educational interventions have shown that students can learn the strategies fact checkers use to efficiently evaluate online information. Measuring the effectiveness of these interventions has required new approaches to assessment because extant measures reveal too little about the processes students use to evaluate live internet sources.

Correcting campaign misinformation: Experimental evidence from a two-wave panel study
Laszlo Horvath, Daniel Stevens, Susan Banducci, Raluca Popp and Travis Coan
In this study, we used a two-wave panel and a real-world intervention during the 2017 UK general election to investigate whether fact-checking can reduce belief in an incorrect campaign claim, whether source effects occur and how long they last, and how predispositions, including political orientation and prior exposure, condition these effects.

How different incentives reduce scientific misinformation online
Piero Ronzani, Folco Panizza, Tiffany Morisseau, Simone Mattavelli and Carlo Martini
Several social media platforms employ, or are considering, recruiting users as a defense against misinformation. Yet it is unclear how to encourage users to make accurate evaluations. Our study shows that presenting the performance of previous participants increases discernment of science-related news. Making participants aware that their evaluations would be used by future participants had no effect on accuracy.

What do we study when we study misinformation? A scoping review of experimental research (2016-2022)
Gillian Murphy, Constance de Saint Laurent, Megan Reynolds, Omar Aftab, Karen Hegarty, Yuning Sun and Ciara M. Greene
We reviewed 555 papers published from 2016 to 2022 that presented misinformation to participants. We identified several trends in the literature: an increasing frequency of misinformation studies over time, a wide variety of topics covered, and a significant focus on COVID-19 misinformation since 2020. We also identified several important shortcomings, including the overrepresentation of samples from the United States and Europe and an excessive emphasis on the short-term consequences of brief, text-based misinformation.

Increasing accuracy motivations using moral reframing does not reduce Republicans’ belief in false news
Michael Stagnaro, Sophia Pink, David G. Rand and Robb Willer
In a pre-registered survey experiment with 2,009 conservative Republicans, we evaluated an intervention that framed accurate perceptions of information as consistent with a conservative political identity and conservative values (e.g., patriotism, respect for tradition, and religious purity). The intervention led participants to report placing greater value on accuracy, and valuing accuracy more was correlated with rating true headlines as more accurate than false ones.

“Fact-checking” fact checkers: A data-driven approach
Sian Lee, Aiping Xiong, Haeseung Seo and Dongwon Lee
This study examined four fact checkers (Snopes, PolitiFact, Logically, and the Australian Associated Press FactCheck) using a data-driven approach. First, we scraped 22,349 fact-checking articles from Snopes and PolitiFact and compared their verdicts and level of agreement. Generally, the two fact checkers agreed with each other, with only one conflicting verdict among 749 matching claims after adjusting for minor rating differences.
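A comparison like the one described above requires reconciling each site's native rating labels before agreement can be counted. The following is a minimal sketch of that idea, not the authors' actual pipeline; the label mapping and example verdicts are hypothetical placeholders:

```python
# Hypothetical sketch: harmonize verdict labels from two fact checkers
# onto a shared scale, then measure how often they agree on matched claims.
# The mapping and data below are illustrative, not the study's real scheme.

VERDICT_MAP = {
    "true": "true", "mostly-true": "true",
    "half-true": "mixed", "mixture": "mixed",
    "mostly-false": "false", "false": "false", "pants-fire": "false",
}

def agreement(matched_claims):
    """matched_claims: list of (snopes_verdict, politifact_verdict) pairs."""
    agree = sum(
        1 for s, p in matched_claims
        if VERDICT_MAP.get(s) == VERDICT_MAP.get(p)
    )
    return agree / len(matched_claims)

pairs = [("mostly-true", "true"), ("false", "pants-fire"), ("mixture", "half-true")]
print(f"Agreement after adjusting minor rating differences: {agreement(pairs):.0%}")
```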

Exploring partisans’ biased and unreliable media consumption and their misinformed health-related beliefs
Natasha Strydhorst, Javier Morales-Riech and Asheley R. Landrum
This study explores U.S. adults’ media consumption—in terms of the average bias and reliability of the media outlets participants report referencing—and the extent to which those participants hold inaccurate beliefs about COVID-19 and vaccination. Notably, we used a novel means of capturing the (left-right) bias and reliability of audiences’ media consumption, leveraging the Ad Fontes Media ratings of 129 news sources along each dimension.
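As a rough illustration of the averaging step described above, assuming per-outlet (bias, reliability) scores in the style of Ad Fontes Media, a participant's consumption profile could be computed as follows; the outlet names and ratings here are made-up placeholders, not actual Ad Fontes data:

```python
# Illustrative sketch (not the authors' code): average the bias and
# reliability ratings of the outlets a participant reports referencing.
from statistics import mean

# Hypothetical (bias, reliability) ratings; bias runs left (-) to right (+).
RATINGS = {
    "Outlet A": (-12.0, 44.0),
    "Outlet B": (3.5, 47.5),
    "Outlet C": (18.0, 32.0),
}

def consumption_profile(reported_outlets):
    """Return the mean bias and mean reliability of a participant's outlets."""
    rated = [RATINGS[o] for o in reported_outlets if o in RATINGS]
    biases, reliabilities = zip(*rated)
    return mean(biases), mean(reliabilities)

bias, reliability = consumption_profile(["Outlet A", "Outlet C"])
print(f"Mean bias: {bias:+.1f}, mean reliability: {reliability:.1f}")
```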