Articles By
David G. Rand

Increasing accuracy motivations using moral reframing does not reduce Republicans’ belief in false news
Michael Stagnaro, Sophia Pink, David G. Rand and Robb Willer
In a pre-registered survey experiment with 2,009 conservative Republicans, we evaluated an intervention that framed accurate perceptions of information as consistent with a conservative political identity and conservative values (e.g., patriotism, respect for tradition, and religious purity). The intervention caused participants to report placing greater value on accuracy, and greater valuation of accuracy was in turn correlated with successfully rating true headlines as more accurate than false ones.

Examining accuracy-prompt efficacy in combination with using colored borders to differentiate news and social content online
Venya Bhardwaj, Cameron Martel and David G. Rand
Recent evidence suggests that prompting users to consider the accuracy of online posts increases the quality of news they share on social media. Here we examine how accuracy prompts affect user behavior in a more realistic context, and whether their effect can be enhanced by using colored borders to differentiate news from social content.

Digital literacy is associated with more discerning accuracy judgments but not sharing intentions
Nathaniel Sirlin, Ziv Epstein, Antonio A. Arechar and David G. Rand
It has been widely argued that social media users with low digital literacy—who lack fluency with basic technological concepts related to the internet—are more likely to fall for online misinformation, but surprisingly little research has examined this association empirically. In a large survey experiment involving true and false news posts about politics and COVID-19, we found that digital literacy is indeed an important predictor of the ability to tell truth from falsehood when judging headline accuracy.

Happiness and surprise are associated with worse truth discernment of COVID-19 headlines among social media users in Nigeria
Leah R. Rosenzweig, Bence Bago, Adam J. Berinsky and David G. Rand
Do emotions we experience after reading headlines help us discern true from false information, or do they cloud our judgment? Understanding whether emotions are associated with distinguishing truth from fiction and sharing information has implications for interventions designed to curb the spread of misinformation.

Developing an accuracy-prompt toolkit to reduce COVID-19 misinformation online
Ziv Epstein, Adam J. Berinsky, Rocky Cole, Andrew Gully, Gordon Pennycook and David G. Rand
Recent research suggests that shifting users’ attention to accuracy increases the quality of news they subsequently share online. Here we help develop this initial observation into a suite of deployable interventions for practitioners. We ask (i) how prior results generalize to other approaches for prompting users to consider accuracy, and (ii) for whom these prompts are more versus less effective.

Research note: Examining false beliefs about voter fraud in the wake of the 2020 Presidential Election
Gordon Pennycook and David G. Rand
The 2020 U.S. Presidential Election saw an unprecedented number of false claims alleging election fraud and arguing that Donald Trump was the actual winner of the election. Here we report a survey exploring belief in these false claims that was conducted three days after Biden was declared the winner.

Emphasizing publishers does not effectively reduce susceptibility to misinformation on social media
Nicholas Dias, Gordon Pennycook and David G. Rand
Survey experiments with nearly 7,000 Americans suggest that increasing the visibility of publishers is an ineffective, and perhaps even counterproductive, way to address misinformation on social media. Our findings underscore the importance of social media platforms and civil society organizations evaluating interventions experimentally rather than implementing them based on intuitive appeal.