Gamified inoculation reduces susceptibility to misinformation from political ingroups
Cecilie Steenbuch Traberg, Jon Roozenbeek and Sander van der Linden
Psychological inoculation interventions, which seek to pre-emptively build resistance against unwanted persuasion attempts, have shown promise in reducing susceptibility to misinformation. However, many people receive news from popular, mainstream ingroup sources (e.g., a left-wing person consuming left-wing media) that may host misleading or false content, and ingroup sources may be more persuasive. The impact of such source effects on inoculation interventions therefore demands attention.
User experiences and needs when responding to misinformation on social media
Pranav Malhotra, Ruican Zhong, Victor Kuan, Gargi Panatula, Michelle Weng, Andrea Bras, Connie Moon Sehat, Franziska Roesner and Amy Zhang
This study examines the experiences of those who participate in bottom-up user-led responses to misinformation on social media and outlines how they can be better supported via software tools. Findings show that users desire support tools designed to minimize time and effort in identifying misinformation and provide tailored suggestions for crafting responses to misinformation that account for emotional and relational context.
Did the Musk takeover boost contentious actors on Twitter?
Christopher Barrie
After his acquisition of Twitter, Elon Musk pledged to overhaul verification and moderation policies. These events sparked fears of a rise in the influence of contentious actors—notably from the political right. I investigated whether these actors received increased engagement over this period by gathering tweet data for accounts that purchased blue-tick verification before and after the Musk takeover.
Does incentivization promote sharing “true” content online?
Hansika Kapoor, Sarah Rezaei, Swanaya Gurjar, Anirudh Tagat, Denny George, Yash Budhwar and Arathy Puthillam
In an online experiment in India, incentives for sharing factual posts increased sharing compared to no incentivization. However, the type of incentive (monetary or social) did not influence sharing behavior in a custom social media simulation. Curbing misinformation may therefore not require substantial monetary resources; instead, social media platforms can devise ways to socially incentivize their users to be responsible netizens who share true information.
Support for “doing your own research” is associated with COVID-19 misperceptions and scientific mistrust
Sedona Chinn and Ariel Hasell
Amid concerns about misinformation online and bias in news, there are increasing calls on social media to “do your own research.” In an abundant information environment, critical media consumption and information validation are desirable. However, using panel survey data, we find that positive perceptions toward “doing your own research” are associated with holding more misperceptions about COVID-19 and less trust in science over time.
Explaining beliefs in electoral misinformation in the 2022 Brazilian election: The role of ideology, political trust, social media, and messaging apps
Patrícia Rossini, Camila Mont’Alverne and Antonis Kalogeropoulos
The 2022 elections in Brazil demonstrated that disinformation can have violent consequences, particularly when it comes from the top, raising concerns about democratic backsliding. This study leverages a two-wave survey to investigate individual-level predictors of holding electoral misinformation beliefs and the role of trust and information habits during the 2022 Brazilian elections.
Mapping the website and mobile app audiences of Russia’s foreign communication outlets, RT and Sputnik, across 21 countries
Julia Kling, Florian Toepfl, Neil Thurman and Richard Fletcher
Following Russia’s invasion of Ukraine, policymakers worldwide have taken measures to curb the reach of Russia’s foreign communication outlets, RT and Sputnik. Mapping the audiences of these outlets across 21 countries, we show that in the quarter before the invasion, neither outlet reached more than 5% of any country’s digital population in a given month via its official websites and mobile apps.
Research note: This salesperson does not exist: How tactics from political influence operations on social media are deployed for commercial lead generation
Josh A. Goldstein and Renée DiResta
Researchers of foreign and domestic influence operations document tactics that frequently recur in covert propaganda campaigns on social media, including backstopping fake personas with plausible biographies or histories, using GAN-generated images as profile photos, and outsourcing account management to paid organizations.
Research note: Explicit voter fraud conspiracy cues increase belief among co-partisans but have broader spillover effects on confidence in elections
Benjamin A. Lyons and Kaitlyn S. Workman
In this pre-registered experiment, we test the effects of conspiracy cue content in the context of the 2020 U.S. elections. Specifically, we varied whether respondents saw an explicitly stated conspiracy theory, one that was merely implied, or none at all. We found that explicit cues about rigged voting machines increased belief in such theories, especially when the cues targeted the opposing political party.
Research note: Tiplines to uncover misinformation on encrypted platforms: A case study of the 2019 Indian general election on WhatsApp
Ashkan Kazemi, Kiran Garimella, Gautam Kishore Shahi, Devin Gaffney and Scott A. Hale
There is currently no easy way to discover potentially problematic content on WhatsApp and other end-to-end encrypted platforms at scale. In this paper, we analyze the usefulness of a crowd-sourced tipline through which users can submit content (“tips”) that they want fact-checked.