“Fact-checking” fact checkers: A data-driven approach
Sian Lee, Aiping Xiong, Haeseung Seo and Dongwon Lee
This study examined four fact checkers (Snopes, PolitiFact, Logically, and the Australian Associated Press FactCheck) using a data-driven approach. First, we scraped 22,349 fact-checking articles from Snopes and PolitiFact and compared their verdicts and levels of agreement. Generally, the two fact checkers agreed with each other, with only one conflicting verdict among 749 matching claims after adjusting for minor rating differences.
Exploring partisans’ biased and unreliable media consumption and their misinformed health-related beliefs
Natasha Strydhorst, Javier Morales-Riech and Asheley R. Landrum
This study explores U.S. adults’ media consumption—in terms of the average bias and reliability of the media outlets participants report referencing—and the extent to which those participants hold inaccurate beliefs about COVID-19 and vaccination. Notably, we used a novel means of capturing the (left-right) bias and reliability of audiences’ media consumption, leveraging the Ad Fontes Media ratings of 129 news sources along each dimension.
Assessing misinformation recall and accuracy perceptions: Evidence from the COVID-19 pandemic
Sarah E. Kreps and Douglas L. Kriner
Misinformation is ubiquitous; however, the extent and heterogeneity of public uptake remain a matter of debate. We address these questions by exploring Americans’ ability to recall prominent misinformation during the COVID-19 pandemic and the factors associated with accuracy perceptions of these claims.
Who knowingly shares false political information online?
Shane Littrell, Casey Klofstad, Amanda Diekman, John Funchion, Manohar Murthi, Kamal Premaratne, Michelle Seelig, Daniel Verdear, Stefan Wuchty and Joseph E. Uscinski
Some people share misinformation accidentally, but others do so knowingly. To fully understand the spread of misinformation online, it is important to analyze those who purposely share it. Using a 2022 U.S. survey, we found that 14 percent of respondents reported knowingly sharing misinformation, and that these respondents were more likely to also report support for political violence, a desire to run for office, and warm feelings toward extremists.
A survey of expert views on misinformation: Definitions, determinants, solutions, and future of the field
Sacha Altay, Manon Berriche, Hendrik Heuer, Johan Farkas and Steven Rathje
We surveyed 150 academic experts on misinformation and identified areas of expert consensus. Experts defined misinformation as false and misleading information, though views diverged on the importance of intentionality and what exactly constitutes misinformation. The most popular reason why people believe and share misinformation was partisanship, while lack of education was one of the least popular reasons.
Older Americans are more vulnerable to prior exposure effects in news evaluation
Benjamin A. Lyons
Older news users may be especially vulnerable to prior exposure effects, whereby news comes to be seen as more accurate over multiple viewings. I test this in re-analyses of three two-wave, nationally representative surveys in the United States (N = 8,730) in which respondents rated a series of mainstream, hyperpartisan, and false political headlines (139,082 observations).
Less reliable media drive interest in anti-vaccine information
Samikshya Siwakoti, Jacob N. Shapiro and Nathan Evans
As progress on the vaccine rollout in the United States slowed in Spring 2021, it became clear that anti-vaccine information posed a public health threat. Using text data from 5,613 distinct COVID misinformation stories and 70 anti-vaccination Facebook groups, we tracked highly salient keywords in anti-vaccine discourse across Twitter, thousands of news websites, and the Google and Bing search engines from May through June 2021, a key period when progress on vaccinations stalled.
How effective are TikTok misinformation debunking videos?
Puneet Bhargava, Katie MacDonald, Christie Newton, Hause Lin and Gordon Pennycook
TikTok provides an opportunity for citizen-led debunking, in which users correct other users’ misinformation. In the present study (N = 1,169), participants watched and rated the credibility of (1) a misinformation video, (2) a correction video, or (3) a misinformation video followed by a correction video (“debunking”).
Examining accuracy-prompt efficacy in combination with using colored borders to differentiate news and social content online
Venya Bhardwaj, Cameron Martel and David G. Rand
Recent evidence suggests that prompting users to consider the accuracy of online posts increases the quality of news they share on social media. Here we examine how accuracy prompts affect user behavior in a more realistic context, and whether their effect can be enhanced by using colored borders to differentiate news from social content.
Search engine manipulation to spread pro-Kremlin propaganda
Evan M. Williams and Kathleen M. Carley
The Kremlin’s use of bots and trolls to manipulate the recommendation algorithms of social media platforms is well-documented by many journalists and researchers. However, pro-Kremlin manipulation of search engine algorithms has rarely been explored. We examine pro-Kremlin attempts to manipulate search engine results by comparing the backlink and keyphrase networks of US, European, and Russian think tanks, as well as Kremlin-linked “pseudo” think tanks that target Western audiences.