Commentary

Repress/redress: What the “war on terror” can teach us about fighting misinformation


Misinformation, like terrorism, thrives where trust in conventional authorities has eroded. An informed policy response must therefore complement efforts to repress misinformation with efforts to redress loss of trust. At present, however, we are repeating the mistakes of the war on terror, prioritizing repressive, technologically deterministic solutions while failing to redress the root sociopolitical grievances that cultivate our receptivity to misinformation in the first place.

Image by Levi Clancy on Unsplash

Introduction

As the COVID-19 pandemic rages across the globe, societies have simultaneously been buffeted by an “infodemic” of conspiracy theories and phony medical advice, seemingly peddled by pamphleteers and presidents alike. As with the virus, the manner in which societies choose to address the phenomenon of misinformation may have consequences for decades to come. In this essay, we argue that the Western policy response to misinformation thus far shares a troubling affinity with its response to terrorism a generation ago, in the wake of 9/11. Superficially, of course, the nascent war on misinformation looks nothing like the war on terror when comparing the sheer scale of violence and destruction. More abstractly, however, the Western policy responses across both campaigns share a common denominator, namely, a reflexive tendency to see both terrorism and misinformation as nuisance phenomena that should be repressed, rather than symptoms of underlying sociopolitical maladies that should be redressed. Like terrorism, misinformation is seen not as endemic, but as an aberration foisted upon society from the outside, whether by malicious foreign actors or domestic misfits. In reality, however, terrorism is often symptomatic of frustration with political authorities, and misinformation likewise thrives where trust in authorities has eroded. Consequently, we predict that the war on misinformation, like the war on terror, will substantially be a war on symptoms — protracted and futile. In tandem with palliative measures, we advocate for a concerted effort to redress root grievances and restore faith in our political system.

Two sides of the COIN

After the devastating attacks on New York and Washington DC on September 11, 2001, the United States and her Western allies declared a global “war on terror,” aiming not only to prevent further attacks on American and European soil, but to seek and destroy terrorist safe havens in the anarchic peripheries of failed states across western Asia. Wrathful and resolute, the Western coalition initially adopted a predominantly repressive approach to counterinsurgency (COIN), seeking to identify, kill, arrest, render, deport, and freeze the assets of suspected terrorists: in short, to deny them the “means” of attacking.

This repressive, means-denying approach was consistently favored over its theoretical alternative, the “hearts and minds” approach to COIN, which aims to redress the political grievances motivating citizens to sympathize with, support, and give succor to the terrorists. Indeed, months before 9/11, a University of California political science professor published a book, Blowback, in which he predicted that American empire-building activities abroad were stoking hatred and resentment that would soon translate into retaliatory violence (Johnson, 2000). His prescient remarks were ignored not only before, but also after the Twin Towers fell, and far into the Afghanistan and Iraq campaigns. Whether in official state missives or Hollywood dramatizations, the persistent question on the minds of many Americans, “why do they hate us?”, continued to be answered by othering the terrorists as irrational fanatics. Whether the enemy called itself Al-Qaeda, the Taliban, Hamas, Hezbollah, or ISIS, its supposed ideological intransigence implied that negotiations were impossible across the board; a military solution was the only option.

Interestingly, it was the military (specifically, the United States Marine Corps) that first recognized the inadequacy of this repression doctrine. Starting in 2007, during the occupation of Iraq, the Marines updated their field manual to espouse a more “hearts and minds” approach (Nagl et al., 2008). At the time, Iraqi insurgents were waging an effective urban guerrilla war that hinged on the complicity of Iraqi civilians. If the Marines could win over the population, so the theory went, the civilians would reciprocate by ratting out the insurgents. Sure enough, a series of neighborhood-level development initiatives did indeed seem to curry favor with Iraqi civilians and coincided with localized drawdowns in insurgent violence (Berman et al., 2011).

Though the “hearts and minds” approach was an important advancement in American military doctrine, casting it as a counterinsurgency strategy misses the larger intellectual departure. Addressing citizens’ political grievances is not really counterinsurgency; it is just plain politics. Indeed, in Iraq, the American military was uncomfortably aware that it was being asked to fill not only a security vacuum but also a political one (Schadlow, 2017). To shift from a repressive, means-denying approach to a redressive, motives-oriented approach is to shift from plotting military operations to doing community outreach. Instead of burdening the military with the task of tracking down and killing terrorists, it would be up to civilian policymakers to work with marginalized communities to help redress their grievances so that they do not become hotbeds of extremism. In the decades after 9/11, however, American policymakers largely did the exact opposite, alienating (Razack, 2008) and stigmatizing (Marzouki, 2017) Muslims at home while ramping up drone strikes and prolonging controversial and destabilizing military occupations abroad. By casting the phenomenon of terrorism in strictly martial terms, the previous generation of policymakers “securitized” what was substantively a political matter. Instead of exhausting all redressive options before turning to repression, policymakers operated in reverse, treating politics as something one resorts to only after violence fails. Instead of redressing the root causes of political discontent, they mandated the military to prosecute an unwinnable global war on its symptoms — with devastating consequences for life and liberty both abroad and at home.

From “weaponized” social media to securitized information

Our present response to misinformation exhibits this same attitude of “politics as a last resort.” Like terrorism, misinformation is predominantly characterized in terms of its perpetrators and its means of perpetration — with Russia, China, and Iran as the “bad actors,” and Facebook, Twitter, and other large technology companies, their willing partners at worst and ignorant patsies at best. Renee DiResta argued in a 2018 WIRED piece that responding to information warfare was a cybersecurity issue and that our priorities should be to identify and eliminate influence campaigns (DiResta, 2018). Kara Swisher of Recode suggested in the New York Times that Sri Lanka’s decision to block social media, following the emergence of hoaxes and rumors in the wake of the tragic Easter bombings in 2019 (McCurry, 2019), was a “good” thing (Swisher, 2019). And in the UK government’s 2019 Online Harms White Paper (DCMS, 2020), the solutions proposed were overwhelmingly in favor of content deletions, ISP-level blocking, and criminal liability, with minimal language spent on safeguarding freedom of expression. In other words, target the creators and block the means of dissemination.

As with the war on terror, a consequence of the Global North’s repressive, means-denying war on misinformation is the rhetorical cover it offers for the erosion of civil liberties across the Global South. Leaders with authoritarian proclivities have seized upon the vocabulary of fighting “fake news” to enact censorship-enabling legislation (Mchangama & McLaughlin, 2020), intimidate and harass journalists (Islam, 2018), and increase surveillance (Cushing, 2019). Following a series of deadly mob-fueled lynchings (Madrigal, 2018) based on rumors circulating on WhatsApp, India has proposed legislation (Newton, 2020) that would compel tech platforms to hand over information without a court order or warrant, requiring that any post be “traceable” to its origin — essentially forcing tech companies to weaken or break encryption altogether. Already plagued by severe safety concerns and security constraints, journalists in Egypt are being arrested at an alarming rate for allegedly spreading false news (Open Technology Fund, 2019). The government, in an attempt to crack down on the spread of misinformation, has established a “rumor collection network” — turning neighbors and citizens into informants for Egypt’s expanding surveillance state. Meanwhile, Nigerian activists and journalists are also sounding the alarm over a proposed bill that has been described as an “attempt to gag the media” (Rozen, 2020). The senator co-sponsoring the bill, Mohammed Sani Musa, told the independent non-profit Committee to Protect Journalists that the bill was “guided by online controls in other jurisdictions,” namely Singapore, the U.K., and the EU (ibid.).

It’s not post-truth; it’s post-trust

A redressive, motives-oriented approach to handling misinformation, by contrast, would begin by acknowledging the many deep-rooted sociopolitical grievances that cultivate citizens’ receptivity to falsehoods and misrepresentations. It has been said that we live in a post-truth world (Oxford Languages, 2016), where facts are less influential in shaping public opinion than appeals to emotion and personal belief. It might equally be argued that we live in a post-trust world. Caught off-guard by Trump’s electoral triumph in 2016, liberal commentators quickly seized upon allegations of Russian misinformation as their scapegoat; yet the wave of populist ire that carried Trump to the White House was long in the making. Decades of wage stagnation, rising student debt, and crumbling infrastructure, coupled with the boorish anti-intellectualism of outlets such as Fox News, have fostered a climate of disbelief and institutional distrust among Americans, while lending greater credence to peripheral voices. As media and communication scholar Johan Farkas has noted, the decline of democracy was not precipitated by social media but had already been underway for quite some time (Farkas, 2019).

Now, as the pandemic turns compliance with authorities into a life-or-death decision, institutional decay and erosion of credibility are laid bare. The fact that the CDC must compete with conspiracy theorists online indicates just how far things have declined. In an illuminating presentation at MIT’s recent conference on Exploring Media Ecosystems, José Cansado, a medical misinformation researcher, described how anti-vaxxer rhetoric tends to feed off citizens’ disillusionment and distrust of profit-hungry “Big Pharma” — and the medical authorities suspected to be in their pockets. Indeed, a recent study based on a survey of nearly 2,500 Americans conducted during a measles outbreak found that trust in medical authorities was the strongest single predictor of an individual’s stance on vaccinations (Stecula et al., 2020).

In the Global South, the erosion of credibility is also manifest. Last month, on state-run television, the Egyptian minister of public health incorrectly claimed that the coronavirus can be neutralized with antibiotics (Al Jazeera Egypt, 2020). Egyptians can hardly be blamed if they turn to alternative information sources. Indeed, in Malaysia, following decades of government interference and outright ownership of mainstream media outlets, citizens have drastically shifted to online sources for news, including social media, where false and misleading information abounds (Nain, 2018).

In short, citizens across the globe are understandably disillusioned with and distrustful of conventional authorities. Just as terrorists capitalize on this, so too do conspiracy theorists, trolls, charlatans, and so on. To be sure, some legislative and technological guardrails need to be put in place to prevent the rampant spread of misinformation. Clear and consistent enforcement of terms of service, increased transparency around campaign spending, and the removal of harmful content, such as online harassment and doxing, are needed. Algorithmic transparency and accountability are also important in surfacing how misinformation or other harmful content is spread and, in some cases, monetized. However, in addition to detection and deletion, we need a complementary redressive approach, seeking to slow and reverse people’s loss of trust in the center while restoring inclusive political institutions and accountable authorities. After all, misinformation tactics are ephemeral, shifting and changing constantly (Lim et al., 2019). Political grievances, by contrast, run deep and are easier to build a sustained strategy around. If, for example, Russia ends up exploiting the marginalization of African Americans in order to sow domestic discord, as suggested in Thomas Rid’s New York Times op-ed (Rid, 2020), then the policy response should be twofold. Yes, we should detect and uproot Russian propaganda infrastructure where practicable. But policymakers should also seek to redress the mistreatment and social injustice that made audiences receptive to such propaganda in the first place.

Fortunately, the fight against misinformation is still nascent, and there is much room for improvement. If we can address the social layer along with the technical layer, and formulate more holistic policies that consider both repressive and redressive measures in tandem, we may avoid the pitfalls of the war on terror. Great strides have been made recently as research highlighting the social conditions that lead to misinformation has gained more attention and traction. Jonathan Corpus Ong and Jason Cabañes’ ethnographic work in the Philippines, for example, illuminates the industry incentives as well as the sociopolitical and legislative foundations that drive disinformation (Ong & Cabañes, 2018). Recent research on vaccine hesitancy illustrates how loss of trust in medical experts is the biggest driver of anti-vaxx beliefs, which may inform how medical professionals communicate with their patients (Stecula et al., 2020). And Brandi Collins-Dexter from Color of Change illustrates how historical and ongoing racial injustice has allowed potentially dangerous and false statements to proliferate among Black online communities (Collins-Dexter, 2020). The authors of this article fully admit that these are not easy challenges, and that the road ahead will be a long one requiring a wide range of options to address the diversity of misinformation proliferating online. There will not be a one-size-fits-all solution: addressing why people flock to health misinformation, for example, will be extremely different from addressing extremist political disinformation. However, to disregard the reasons behind an individual or group’s receptivity to harmful misinformation while privileging technological solutions risks infringing civil rights and ignoring both real and perceived sociopolitical injustice.

Cite this Essay

Abrahams, A., & Lim, G. (2020). Repress/redress: What the “war on terror” can teach us about fighting misinformation. Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-032

Bibliography

Al Jazeera Egypt [@AJA_Egypt]. (2020, March 8). “لا داعي للقلق”.. وزيرة #الصحة تقول بتصريحات تلفزيونية إن الحالات المصابة بفيروس #كورونا لا تحتاج جميعها إلى مستشفى ويمكن علاجها بخافض للحرارة ومضادات حيوية [“No need to worry”: The Minister of #Health says in televised statements that not all #coronavirus cases require hospitalization and that they can be treated with fever reducers and antibiotics] [Tweet; image]. Twitter. https://twitter.com/AJA_Egypt/status/1236644934909902848

Berman, E., Shapiro, J. N., & Felter, J. H. (2011). Can hearts and minds be bought? The economics of counterinsurgency in Iraq. Journal of Political Economy, 119(4), 766-819. https://doi.org/10.1086/661983

Collins-Dexter, B. (2020). Canaries in the coal mine: COVID-19 misinformation and black communities. Harvard Kennedy School Shorenstein Center. https://doi.org/10.37016/TASC-2020-01

Cushing, T. (2019, June 4). Singapore’s fake news law is also an internet surveillance law. TechDirt. https://www.techdirt.com/articles/20190603/18492642323/singapores-fake-news-law-is-also-internet-surveillance-law.shtml

DiResta, R. (2018, August 3). The information war is on. Are we ready for it? Wired. https://www.wired.com/story/misinformation-disinformation-propaganda-war/

Farkas, J. (2019, December 10). The fight against fake news is a greater threat to democracy than fake news itself. Malmö University News. https://mau.se/en/news/thefightagainstfakenews/

Islam, S. (2018, December 18). In ‘fake news’ crackdown, Egypt is a world leader on jailing journalists, bloggers and social media users. Los Angeles Times. https://www.latimes.com/world/la-fg-egypt-fake-news-arrests-20181218-story.html

Johnson, C. (2000). Blowback: The costs and consequences of American empire. Macmillan.

Lim, G., Maynier, E., Scott-Railton, J., Fittarelli, A., Moran, N., & Deibert, R. (2019, May 14). Burned after reading: Endless Mayfly’s ephemeral disinformation campaign. The Citizen Lab. https://citizenlab.ca/2019/05/burned-after-reading-endless-mayflys-ephemeral-disinformation-campaign/

Madrigal, A. C. (2018, September 25). India’s lynching epidemic and the problem with blaming tech. The Atlantic. https://www.theatlantic.com/technology/archive/2018/09/whatsapp/571276/

Marzouki, N. (2017). Islam: An American religion. Columbia University Press.

McCurry, J. (2019, April 22). Sri Lanka terrorist attacks amongst world’s worst since 9/11. The Guardian. https://www.theguardian.com/world/2019/apr/22/sri-lanka-terrorist-attacks-among-worst-world-911

Mchangama, J., & McLaughlin, S. (2020, April 1). Coronavirus has started a censorship pandemic. Foreign Policy. https://foreignpolicy.com/2020/04/01/coronavirus-censorship-pandemic-disinformation-fake-news-speech-freedom/

Nagl, J. A., Amos, J. F., Sewall, S., & Petraeus, D. H. (2008). The US Army/Marine Corps Counterinsurgency Field Manual. University of Chicago Press.

Nain, Z. (2018). Malaysia. Reuters Institute Digital News Report. http://www.digitalnewsreport.org/survey/2018/malaysia-2018/

Newton, C. (2020, February 14). India’s proposed internet regulations could threaten privacy everywhere. The Verge. https://www.theverge.com/interface/2020/2/14/21136273/india-internet-rules-encryption-privacy-messaging

Copyright

This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided that the original author and source are properly credited.