Explore All Articles

Journalistic interventions matter: Understanding how Americans perceive fact-checking labels
Chenyan Jia and Taeyoung Lee
While algorithms and crowdsourcing have been increasingly used to debunk or label misinformation on social media, such tasks might be most effective when performed by professional fact checkers or journalists. Drawing on a national survey (N = 1,003), we found that U.S. adults evaluated fact-checking labels created by professional fact checkers as more effective than labels by algorithms and other users.

Did the Musk takeover boost contentious actors on Twitter?
Christopher Barrie
After his acquisition of Twitter, Elon Musk pledged to overhaul verification and moderation policies. These events sparked fears of a rise in the influence of contentious actors—notably from the political right. I investigated whether these actors received increased engagement over this period by gathering tweet data for accounts that purchased blue-tick verification before and after the Musk takeover.

A survey of expert views on misinformation: Definitions, determinants, solutions, and future of the field
Sacha Altay, Manon Berriche, Hendrik Heuer, Johan Farkas and Steven Rathje
We surveyed 150 academic experts on misinformation and identified areas of expert consensus. Experts defined misinformation as false and misleading information, though views diverged on the importance of intentionality and on what exactly constitutes misinformation. Partisanship was the reason experts cited most often for why people believe and share misinformation, while lack of education was among the least cited.

Research note: This salesperson does not exist: How tactics from political influence operations on social media are deployed for commercial lead generation
Josh A. Goldstein and Renée DiResta
Researchers of foreign and domestic influence operations document tactics that frequently recur in covert propaganda campaigns on social media, including backstopping fake personas with plausible biographies or histories, using GAN-generated images as profile photos, and outsourcing account management to paid organizations.

Measuring the effect of Facebook’s downranking interventions against groups and websites that repeatedly share misinformation
Emmanuel M. Vincent, Héloïse Théro and Shaden Shabayek
Facebook has claimed to fight misinformation, notably by reducing the virality of posts shared by “repeat offender” websites. The platform recently extended this policy to groups. We identified websites and groups that repeatedly publish false information according to fact checkers and investigated the implementation and impact of Facebook’s measures against them.

Self-regulation 2:0? A critical reflection of the European fight against disinformation
Ethan Shattock
In presenting the European Democracy Action Plan (EDAP) in 2020, the European Commission pledged to build more resilient democracies across the EU. As part of this plan, the Commission announced intensified measures to combat disinformation, both through the incoming Digital Services Act (DSA) and specific measures to address sponsored content online.

COVID-19
The Twitter origins and evolution of the COVID-19 “plandemic” conspiracy theory
Matthew D. Kearney, Shawn C. Chiang and Philip M. Massey
Tweets about “plandemic” (e.g., #plandemic)—the notion that the COVID-19 pandemic was planned or fraudulent—helped to spread several distinct conspiracy theories related to COVID-19. The term’s catchy nature also attracted attention from anti-vaccine activist filmmakers, who ultimately created Plandemic, a 26-minute documentary.

COVID-19
The spread of COVID-19 conspiracy theories on social media and the effect of content moderation
Orestis Papakyriakopoulos, Juan Carlos Medina Serrano and Simon Hegelich
We investigate the diffusion of conspiracy theories related to the origin of COVID-19 on social media. By analyzing third-party content on four social media platforms, we show that: (a) contrary to conventional wisdom, mainstream sources contribute more overall to the diffusion of conspiracy theories than alternative and other sources; and (b) platforms’ content moderation practices can mitigate the spread of conspiracy theories.

Repress/redress: What the “war on terror” can teach us about fighting misinformation
Alexei Abrahams and Gabrielle Lim
Misinformation, like terrorism, thrives where trust in conventional authorities has eroded. An informed policy response must therefore complement efforts to repress misinformation with efforts to redress loss of trust. At present, however, we are repeating the mistakes of the war on terror, prioritizing repressive, technologically deterministic solutions while failing to redress the root sociopolitical grievances that cultivate our receptivity to misinformation in the first place.