Commentary

Beyond the deepfake hype: AI, democracy, and “the Slovak case”

Was the 2023 Slovakia election the first swung by deepfakes? Did the victory of a pro-Russian candidate, following the release of a deepfake allegedly depicting election fraud, herald a new era of disinformation? Our analysis of the so-called “Slovak case” complicates this narrative, highlighting critical factors that made the electorate particularly susceptible to pro-Russian disinformation. Moving beyond the deepfake’s impact on the election outcome, this case raises important yet under-researched questions regarding the growing use of encrypted messaging applications in influence operations, misinformation effects in low-trust environments, and politicians’ role in amplifying misinformation—including deepfakes.

The “Slovak case”

Robert Fico’s victory in the 2023 Slovak parliamentary elections thrust this small Central European country into the global spotlight. Fico’s campaign pledges to oppose sanctions on Russia and end military support for Ukraine were noteworthy enough, but the potential influence of a deepfake truly captured global attention.

Two days before the election, a fake audio clip surfaced purportedly capturing Fico’s main rival, pro-European candidate Michal Šimečka, discussing electoral fraud with a prominent journalist. Although both quickly denied its authenticity, the clip went viral, its impact amplified by its release just before the election, during Slovakia’s electoral “silence period”—a remnant of the legacy media era that prohibits media discussion of election-related developments. Šimečka’s loss, despite leading in the polls, fueled speculation that the election was “the first swung by deepfakes” (Conradi, 2023).

The “Slovak case” is now widely seen as the “dawn of a new era of disinformation” (Zuidijk, 2023) and a “test case” (Meaker, 2023) of how vulnerable democratic processes are to AI-driven interference. Casey Newton of Platformer predicted, “[w]hat happened in Slovakia will likely soon occur in many more countries around the world” (2024), while others warned of irreversible consequences: “[t]he deepfake genie is out of the bottle” (Conradi, 2023).

The high stakes demand a nuanced analysis, yet prevailing interpretations fall short. Several observers have cited the Slovak case as proof that images can no longer be trusted as evidence (Harford, 2024), overlooking the historically fraught relationship between truth and media (Paris & Donovan, 2019). Moreover, attributing Fico’s victory to a deepfake downplays critical factors in Slovakia (analyzed below) that prepared the ground for his pro-Putin message.

This is not to underestimate generative AI’s impact on misinformation. Critics of the initial moral panic surrounding fake news may have gone too far in the opposite direction, suggesting that the limited effects paradigm still holds despite technological change (e.g., Garrett, 2019). This paradigm has been rightfully criticized for defining effects so narrowly that outcomes appear marginal (Graves, 2021). Here, mindful that exaggerating misinformation threats is just as dangerous as downplaying them (Belogolova et al., 2024), we limit ourselves to identifying challenging questions raised by the Slovak case that need answering before the implications of deepfakes can be properly understood.

Seeing and believing in the age of AI

Since entering the mainstream in 2017, deepfakes have sparked global anxiety about a future where seeing is no longer believing (Rothman, 2018). The Slovak case revived these concerns, with the Financial Times asking “how long video evidence will continue to be regarded as trustworthy” (Harford, 2024).

These concerns rest on the misconception that audiovisual content objectively represented reality until disrupted by AI (Paris & Donovan, 2019). This overlooks what Paris and Donovan (2019) call “the politics of evidence” (p. 17), where evidence both shapes and is shaped by cultural, social, and political structures, typically to the powerful’s advantage. The notion that evidence speaks for itself neglects the social work required to transform media into evidence, masking the biases embedded within it.

The advent of a new technology does not fundamentally transform how evidence works but does create new opportunities for negotiating expertise (Paris & Donovan, 2019). Consider the bystander footage of George Floyd’s murder by police officer Derek Chauvin. The official narrative of a medical incident might have stood unchallenged if not for the viral video showing Floyd gasping for air, which sparked global outrage and became crucial evidence in Chauvin’s murder trial (Canon, 2021).

Alarmist reactions to misinformation are not harmless; they may lead to counter-measures that, however well-intentioned, might be exploited to police political debate (Jungherr & Schroeder, 2021). Policymakers and technology companies must do more to combat misinformation (Espinoza, 2024), but one-sidedly framing the new media environment as a threat to democracy—let alone advocating for strict laws to prevent misinformation from “infecting” the public (Nekmat & Yue, 2020)—might result in restrictive policies that erode democratic freedoms.

This is not merely a hypothetical scenario: Recent anti-misinformation laws, including in Western democracies like Germany and Hungary, have weakened protections for independent journalism and compromised access to diverse news (Center for News, Technology & Innovation, 2024). Several governments have criminalized fake news, imposing penalties ranging from fines and suspension of publications to imprisonment. These findings caution against “technopanics” (Marwick, 2008), which often obscure underlying problems and encourage responses that could do more harm than good. 

The first election swung by deepfakes?

The potential for an election upset caused by a deepfake prompted speculation about Šimečka’s loss to Fico despite his apparent lead in the exit polls (Newton, 2024; see also Harford, 2024). This scenario, however, overlooks several factors, not least the unwillingness of distrustful Fico supporters to cooperate with pollsters (Búry, 2023), which likely understated his support (see Cavari & Freedman, 2023).

If we are to understand Fico’s victory, a wider lens is required. Since at least 2010, several websites and social media profiles have pushed pro-Kremlin narratives, adapting their strategies to stay relevant (Hrabovska Francelova, 2022). Disinformation campaigns went from defending Russia’s annexation of Crimea to denying pro-Russian separatists’ involvement in the MH17 passenger plane crash, then to casting doubt on the integrity of Slovakia’s 2019 and 2020 elections. More recently, they pivoted to promoting Covid-19 conspiracy theories and justifying Russia’s actions in Ukraine. The deepfake’s impact must be understood in the context of these long-term influence operations.

Slovaks’ susceptibility to pro-Russian disinformation must also be considered. Research shows they are among the most conspiracy-minded in Europe (Hrabovska Francelova, 2022) and that a correlation exists between conspiracy beliefs and pro-Russian attitudes: The majority of Slovaks siding with Russia in the Ukraine war believe the world is controlled by powerful groups (Zelinsky, 2024). This demographic also tends to favor authoritarian leadership over liberal democracy (Hajdu & Klingová, 2023), suggesting Fico’s message found a receptive audience.

Another critical factor in the election concerns public trust in Slovakia’s media sector, which was at historical lows before the election (Hajdu & Klingová, 2023). This trust deficit had significant implications, as those who maintained trust in mainstream media—thus generally avoiding alternative outlets known for disseminating disinformation—were 40 percentage points more likely to blame Russia for the war, compared to those skeptical of mainstream outlets, who tended to blame Ukraine (Hajdu & Klingová, 2023).

Trust in public institutions was even lower. This was largely due to domestic and foreign operations to undermine democracy and weaken ties with transatlantic allies, coupled with unstable and chaotic governance marked by political infighting and inability to pass key legislation, culminating in the government’s collapse in December 2022 (Hajdu & Klingová, 2023). Public confidence has risen recently, which critics see as a sign of growing support for the current administration (Hajdu et al., 2024). Yet, before the election, only 18% of Slovaks trusted their government, contributing to over half the population believing disinformation narratives about election manipulation (Hajdu & Klingová, 2023). It is hard to imagine more fertile ground for the deepfake.

The victory of a pro-Russian candidate can also be attributed to long-standing historical and cultural ties, rooted in a perceived pan-Slavic ethnic and cultural heritage, which have influenced Slovak attitudes towards Russia for centuries (Hajdu & Klingová, 2023). Since the 19th-century rise of Slavic nationalism, many Slovaks have viewed Russia as a liberator from Austro-Hungarian rule, and many still do: Almost four in five regard Russia as their traditional Slavic brother, illustrating a deep affinity woven into Slovakia’s cultural and historical fabric (Hajdu et al., 2020).

Support for Russia in Slovakia waned following Putin’s invasion of Ukraine but rebounded before the election as “Ukraine fatigue” took hold (Bond, 2023). This shift is linked to the war’s impact on Slovakia’s energy sector, heavily reliant on Russian supplies (Bounds, 2022), and an influx of Ukrainian refugees that was among the highest per capita in the EU (Sybera, 2023). Disinformation campaigns depicting refugees as stealing from locals helped convince 69% of the population that refugees received preferential treatment—10 percentage points more than the share favoring increased aid (Hajdu & Klingová, 2023). Compounding this picture are strong anti-NATO sentiments among Slovaks, which have returned to pre-conflict levels despite regional support for the alliance. Meanwhile, support for EU membership continues to dwindle, with Slovakia now reporting the lowest levels in Central and Eastern Europe.

Considering these factors, the notion that Slovakia’s election was the first swung by deepfakes appears reductive, fixating on the effects of the technology while overlooking the complex social, cultural, and political dynamics that propelled a pro-Russian candidate to victory.

Gaps in misinformation research

Reactions to the Slovak case reflect a broader pattern of alarmism about technology-induced harms (Altay et al., 2023). However, these harms should not be dismissed. The case raises important questions that need to be addressed to understand the influence of this and other deepfakes in future elections.

Those challenging alarmist claims about AI’s impact on misinformation often argue that misinformation has a limited reach (Simon et al., 2023). Several studies support this, showing that the internet is not flooded with fake news (Jungherr & Schroeder, 2021). However, estimating the reach of mis-/disinformation is difficult as it moves from social media platforms to encrypted messaging applications, which conceal communications (Rossini et al., 2021). Emerging evidence indicates that propagandists increasingly exploit applications like WhatsApp and Telegram, drawn to their growing popularity, loose moderation policies, and trust within private networks (Woolley, 2022). This trend is reflected in Slovakia, where Telegram has become a haven for pro-Russia propaganda (Ružičková et al., 2024). Indeed, the deepfake appears to have circulated widely on pro-Fico Telegram channels ahead of the election (Conradi, 2023).

Relatedly, we remain skeptical of narratives that minimize the impact of deepfakes on the grounds that “online misinformation consumption is low in the global north” (Acerbi et al., 2022, p. 2). This view holds that because most people consume content from mainstream sources, improvements in misinformation quality would largely go unnoticed (Simon et al., 2023). However, even if misinformation consumption is limited to a small subset, this group can still wield disproportionate influence over the political system, as seen with QAnon (Guess, 2021). Moreover, studies showing that online misinformation consumption is low often focus on a limited range of countries, which are not always representative of the so-called Global North. Given Slovakia’s extremely low trust in news (Newman et al., 2024), which is known to prompt people to seek out alternative sources (Jungherr & Schroeder, 2021), misinformation consumption is likely higher than assumed, leaving the country especially vulnerable to high-quality AI-generated disinformation.

Finally, the Slovak case reveals the risks of reifying online misinformation, neglecting its links to elite political rhetoric (Graves, 2021). The extent of online misinformation consumption may actually be of little relevance in Slovakia, where politicians themselves are major spreaders. Take, for instance, a video published by Fico calling Ukrainian soldiers “pure fascists” (2023), which garnered tens of thousands of interactions during the campaign and was picked up by major mainstream outlets (Zoznam, 2023). Another example introduces a new challenge: While Fico remained silent on the deepfake, other prominent politicians shared it (Kőváry Sólymos, 2023), amplifying its potential influence. Much has been said about the “liar’s dividend” (Chesney & Citron, 2019, p. 151), where liars exploit widespread public skepticism to dismiss true information as false, but there is also a risk of politicians presenting deepfakes as genuine, exploiting their realism to promote false information while claiming to believe in their authenticity.

Conclusion

As nearly half of the global population heads to the polls in 2024, observers are calling on lawmakers to protect elections from the dangers of generative AI lest electoral chaos ensue (Elliott & Kelly, 2024). We welcome calls for vigilance as well as ongoing legislative efforts to mitigate online and AI-related harms (Milmo & Hern, 2024; O’Carroll, 2023). With technology companies rolling back misinformation policies (Fischer, 2023) and McCarthyite campaigns against disinformation researchers on the rise (Lee Myers & Rutenberg, 2024), action is urgently needed.

However, alarmist narratives attributing existential risks to generative AI are unhelpful. Not only do they encourage quick fixes that risk centralizing control over truth, but they also offer politicians a pretext for casting themselves as saviors, boosting their public image while diverting attention from uncomfortable truths (Orben, 2020).

We have complicated these narratives by identifying overlooked factors in Fico’s victory, such as distrust in institutions and affinity for Russia. A narrow focus on the deepfake misses the role of public demand for Fico’s message, promoting responses that address symptoms rather than root causes.

We also raised critical questions for further research. Efforts should focus on monitoring encrypted messaging applications and examining feedback loops between politicians and online misinformation, especially in low-trust environments. Even if the prospect of electoral chaos seems unlikely, answering these questions is crucial to understand and mitigate the impact of misinformation in the age of AI.

We conclude with a sobering thought: With “Putin’s faction” (Brennan, 2024) gaining ground across Europe and the Republican party turning into a “Putin cult” (Meyerson, 2024), alongside a general decline in media trust (Newman et al., 2024), it is not only the Slovak case that risks being replicated but also the conditions that catapulted Fico to power. We find the latter scenario more troubling, as it suggests our current divisions are less over facts and more over values and worldviews—a far thornier issue to resolve.

Cite this Essay

De Nadal, L., & Jančárik, P. (2024). Beyond the deepfake hype: AI, democracy, and “the Slovak case”. Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-153

Bibliography

Acerbi, A., Altay, S., & Mercier, H. (2022). Research note: Fighting misinformation or fighting for information? Harvard Kennedy School (HKS) Misinformation Review, 3(1). https://misinforeview.hks.harvard.edu/article/research-note-fighting-misinformation-or-fighting-for-information/

Altay, S., Berriche, M., & Acerbi, A. (2023). Misinformation on misinformation: Conceptual and methodological challenges. Social Media + Society, 9(1). https://doi.org/10.1177/20563051221150412

Belogolova, O., Foster, L., Rid, T., & Wilde, G. (2024, May 3). Don’t hype the disinformation threat. Foreign Affairs. https://www.foreignaffairs.com/russian-federation/dont-hype-disinformation-threat

Bond, I. (2023, November 21). Ukraine fatigue: Bad for Kyiv, bad for the West. Centre for European Reform. https://www.cer.eu/publications/archive/policy-brief/2023/ukraine-fatigue-bad-kyiv-bad-west

Bounds, A. (2022, September 28). Slovakia energy crisis could ‘kill our economy’, premier warns. Financial Times. https://www.ft.com/content/9a94385a-e007-47dd-a238-7add9f7331c7

Brennan, D. (2024, January 17). Europe’s pro-Putin faction is growing. Newsweek. https://www.newsweek.com/europe-eu-pro-putin-faction-sway-growing-orban-fico-ukraine-hungary-slovakia-1861426

Búry, J. (2023, October 1). Urobili sme chybu, priznal riaditeľ Focusu. Prečo sa exit polly tak extrémne líšili od reality? [We made a mistake, admitted the director of Focus. Why were exit polls so wildly different from reality?]. Hospodárske Noviny. https://hnonline.sk/parlamentne-volby-2023/96107507-urobili-sme-chybu-priznal-riaditel-focusu-preco-sa-exit-polly-tak-extremne-lisili-od-reality

Canon, G. (2021, April 21). ‘I cried so hard’: the teen who filmed Floyd’s killing, and changed America. The Guardian. https://www.theguardian.com/us-news/2021/apr/20/darnella-frazier-george-floyd-derek-chauvin-trial-guilty-verdict

Cavari, A., & Freedman, G. (2023). Survey nonresponse and mass polarization: The consequences of declining contact and cooperation rates. American Political Science Review, 117(1), 332–339. https://doi.org/10.1017/S0003055422000399

Center for News, Technology & Innovation (2024). Most ‘fake news’ legislation risks doing more harm than good amid a record number of elections in 2024. https://innovating.news/article/most-fake-news-legislation-risks-doing-more-harm-than-good-amid-a-record-number-of-elections-in-2024/

Chesney, R., & Citron, D. (2019). Deepfakes and the new disinformation war: The coming age of post-truth geopolitics. Foreign Affairs, 98, 147–155.

Conradi, P. (2023, October 7). Was Slovakia election the first swung by deepfakes? The Times. https://www.thetimes.co.uk/article/was-slovakia-election-the-first-swung-by-deepfakes-7t8dbfl9b

Elliott, V., & Kelly, M. (2024, January 23). The Biden deepfake robocall is only the beginning. Wired. https://www.wired.com/story/biden-robocall-deepfake-danger/

Espinoza, J. (2024, April 29). EU to probe Meta over handling of Russian disinformation. Financial Times. https://www.ft.com/content/70dc27e8-07bd-40af-8b0e-d4623a3bcc0b

Fischer, S. (2023, June 6). Big Tech rolls back misinformation measures ahead of 2024. Axios. https://www.axios.com/2023/06/06/big-tech-misinformation-policies-2024-election

Fico, R. (2023, August 29). Where books start burning, people are not burning: The celebrations of the Slovak national uprising must be returned to the honour and glory that belongs to them! [Facebook post]. Facebook. https://www.facebook.com/robertficosk/videos/3628869640722354/

Garrett, R. K. (2019). Social media’s contribution to political misperceptions in U.S. presidential elections. PLOS ONE, 14(3). https://doi.org/10.1371/journal.pone.0213500

Graves, L. (2021). Lessons from an extraordinary year: Four heuristics for studying mediated misinformation in 2020 and beyond. In H. Tumber & S. Waisbord (Eds.), The Routledge companion to media disinformation and populism (pp. 188–197). Routledge.

Guess, A. M. (2021). (Almost) everything in moderation: New evidence on Americans’ online media diets. American Journal of Political Science, 65(4), 1007–1022.

Hajdu, D., Klingová, K., Sawiris, M., & Milo, D. (2020). GLOBSEC trends 2020: Central Europe, Eastern Europe and Western Balkans at the times of pandemic. Globsec. https://www.globsec.org/sites/default/files/2020-12/GLOBSEC-Trends-2020_read-version.pdf

Hajdu, D., & Klingová, K. (2023). Voices of Central and Eastern Europe: Perceptions of democracy & governance in 10 EU countries. Globsec. https://www.globsec.org/sites/default/files/2020-06/Voices-of-Central-and-Eastern-Europe_print-version.pdf

Hajdu, D., Klingová, K., Kazaz, J., Musilová, V., & Szicherle, P. (2024). GLOBSEC trends 2024: CEE – A brave new region? Globsec. https://www.globsec.org/sites/default/files/2024-05/GLOBSEC%20TRENDS%202024.pdf

Harford, T. (2024, January 25). It’s only a matter of time before disinformation leads to disaster. Financial Times. https://www.ft.com/content/0afb2e58-c7e2-4194-a6e0-927afe0c3555

Hrabovska Francelova, N. (2022, April 6). From vaccination to war: Slovak disinformation outlets quick to shift the conversation. Balkan Insight. https://balkaninsight.com/2022/04/06/from-vaccination-to-war-slovak-disinformation-outlets-quick-to-shift-the-conversation/

Jungherr, A., & Schroeder, R. (2021). Disinformation and the structural transformations of the public arena: Addressing the actual challenges to democracy. Social Media + Society, 7(1). https://doi.org/10.1177/2056305121988928

Kőváry Sólymos, K. (2023). The deepfake video, which interfered in the election campaign, was distributed by both Harabin and Marček. Was it part of a bigger plan? Investigative Centre of Jan Kuciak. https://icjk.sk/280/Deepfake-video-ktore-zasiahlo-do-predvolebnej-kampane-sirili-Harabin-aj-Marcek-Bolo-sucastou-vacsieho-planu

Lee Myers, S., & Rutenberg, J. (2024, April 22). New group joins the political fight over disinformation online. The New York Times. https://www.nytimes.com/2024/04/22/business/media/american-sunlight-project-fight-disinformation.html

Meaker, M. (2023, October 3). Slovakia’s election deepfakes show AI is a danger to democracy. Wired. https://www.wired.com/story/slovakias-election-deepfakes-show-ai-is-a-danger-to-democracy/

Marwick, A. E. (2008). To catch a predator? The MySpace moral panic. First Monday, 13(6). https://doi.org/10.5210/fm.v13i6.2152

Meyerson, H. (2024, February 13). The GOP Putin cult. The American Prospect. https://prospect.org/blogs-and-newsletters/tap/2024-02-13-gop-putin-cult/

Milmo, D., & Hern, A. (2024, March 14). What will the EU’s proposed act to regulate AI mean for consumers? The Guardian. https://www.theguardian.com/technology/2024/mar/14/what-will-eu-proposed-regulation-ai-mean-consumers

Nekmat, E., & Yue, A. (2020, May 1). How to fight the COVID-19 infodemic: Lessons from 3 Asian countries. World Economic Forum. https://www.weforum.org/agenda/2020/05/how-to-fight-the-covid-19-infodemic-lessons-from-3-asian-countries/

Newman, N., Fletcher, R., Robertson, C. T., Ross Arguedas, A., & Kleis Nielsen, R. (2024). Reuters Institute digital news report 2024. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/digital-news-report/2024

Newton, C. (2024, January 16). OpenAI makes an election plan. Platformer. https://www.platformer.news/monday-newsletter/

O’Carroll, L. (2023, August 25). How the EU Digital Services Act affects Facebook, Google and others. The Guardian. https://www.theguardian.com/world/2023/aug/25/how-the-eu-digital-services-act-affects-facebook-google-and-others

Orben, A. (2020). The Sisyphean cycle of technology panics. Perspectives on Psychological Science, 15(5), 1143–1157. https://doi.org/10.1177/1745691620919372

Paris, B., & Donovan, J. (2019). Deepfakes and cheap fakes. Data & Society. https://datasociety.net/wp-content/uploads/2019/09/DS_Deepfakes_Cheap_FakesFinal-1-1.pdf

Rossini, P., Stromer-Galley, J., Baptista, E. A., & Veiga de Oliveira, V. (2021). Dysfunctional information sharing on WhatsApp and Facebook: The role of political talk, cross-cutting exposure and social corrections. New Media & Society, 23(8), 2430–2451. https://doi.org/10.1177/1461444820928059

Rothman, J. (2018, November 5). In the age of A.I., is seeing still believing? The New Yorker. https://www.newyorker.com/magazine/2018/11/12/in-the-age-of-ai-is-seeing-still-believing

Ružičková, M., Dubóczi, P., & Haleková, L. (2024). Dezinformácie a propaganda ako biznis: Mapovanie finančného a organizačného pozadia aktérov na slovenskom Telegrame [Disinformation and propaganda as a business: Mapping the financial and organizational background of actors on the Slovak Telegram]. Infosecurity. https://infosecurity.sk/studie/dezinformacie-a-propaganda-ako-biznis-mapovanie-financneho-a-organizacneho-pozadia-akterov-na-slovenskom-telegrame/

Simon, F. M., Altay, S., & Mercier, H. (2023). Misinformation reloaded? Fears about the impact of generative AI on misinformation are overblown. Harvard Kennedy School (HKS) Misinformation Review, 4(5). https://misinforeview.hks.harvard.edu/article/misinformation-reloaded-fears-about-the-impact-of-generative-ai-on-misinformation-are-overblown/

Sybera, A. (2023, October 26). Slovaks question support for Ukraine as fatigue grows. Business News from Eastern Europe. https://www.intellinews.com/slovaks-question-support-for-ukraine-as-fatigue-grows-298615/ 

Woolley, S. C. (2022). Digital propaganda: The power of influencers. Journal of Democracy, 33(3), 115–129. https://doi.org/10.1353/jod.2022.0027

Zelinsky, D. (2024). What is the ratio of supporters and opponents of conspiracies in Slovak society? Institute of Sociology, Slovak Academy of Sciences. https://sociologia.sav.sk/podujatia.php?id=3329&r=1

Zoznam (2023, August 30). Smer si zorganizoval vlastné oslavy SNP: Ľudia skandovali Ficovo meno! Výzva najvyšším ústavným činiteľom [Smer organized its own Slovak National Uprising celebrations: People chanted Fico’s name! A call to the highest constitutional officials]. Topky. https://www.topky.sk/cl/10/2584487/Smer-si-zorganizoval-vlastne-oslavy-SNP–Ludia-skandovali-Ficovo-meno–Vyzva-najvyssim-ustavnym-cinitelom

Zuidijk, D. (2023, October 4). Deepfakes in Slovakia preview how AI will change the face of elections. Bloomberg. https://www.bloomberg.com/news/newsletters/2023-10-04/deepfakes-in-slovakia-preview-how-ai-will-change-the-face-of-elections?embedded-checkout=true

Funding

No funding was received to conduct this research.

Competing Interests

The authors declare no competing interests.

Copyright

This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided that the original author and source are properly credited.

Authorship

Both authors contributed equally to this article.

Acknowledgements

Peter Jančárik wishes to thank Peter Dubóczi, Jakub Hankovský, Katarína Klingová, Tomáš Kriššák, Kristína Šefčíková, Dominik Želinský, and Slovak Academy of Sciences for their data and insights. Lluis de Nadal wishes to thank his students from the “Social Media, Disinformation and Democracy” sociology course at Glasgow University for invaluable discussions on the topic.