Peer Reviewed

Taking the power back: How diaspora community organizations are fighting misinformation spread on encrypted messaging apps


We applied a mixed-methods approach with the goal of understanding how Latinx and Asian diaspora communities perceive and experience the spread of misinformation through encrypted messaging apps in the United States. Our study consists of 12 in-depth interviews with leaders of relevant diaspora community organizations and a computer-assisted content analysis of 450,300 messages published on Telegram between July 2020 and December 2021. We found evidence of cross-platform misinformation sharing, particularly between Telegram, WhatsApp, and YouTube. The enclosed nature of encrypted messaging applications makes them a testing ground for misinformation narratives before these narratives are sent out to open platforms. Finally, YouTube is a central component of misinformation spread because much of the misinformation content spread in these communities is video-based.


Research Questions

  • How do diaspora community organizations perceive the spread of misinformation on encrypted messaging apps within their communities in the United States?
  • Does a content analysis of the misinformation channels identified by these leaders confirm their perceptions of the spread of misinformation?
  • How have diaspora community organizations responded to the misinformation spread on encrypted messaging apps?
  • What do diaspora community organizations need to respond to misinformation spread on encrypted messaging apps?

Research Note Summary

  • We conducted interviews with leaders from diaspora communities (N = 12) who worked in civil society organizations dedicated to the advancement of Latinx and Asian communities in the United States to identify 1) how diaspora community organizations perceive and respond to misinformation spread and 2) their immediate needs for fighting this spread. We then added a computer-assisted content analysis to quantitatively assess these organizations’ perceptions of the volume and content of misinformation spread within their communities.
  • We found evidence of cross-platform misinformation sharing, especially between encrypted messaging apps (such as Telegram and WhatsApp) and YouTube. Additionally, we found that political actors are taking advantage of the encrypted nature of these applications to test narratives before publishing misinformation to open platforms. Thus, monitoring these messaging applications may help journalists, fact checkers, and community organizations anticipate content that will be published on other platforms, which may accelerate fact-checking initiatives. Additionally, YouTube is a central component of misinformation spread among diaspora communities, which makes it important to create more effective tools to monitor this platform in search of misinformation content.
  • Our findings also point to evidence about the direction of global flows of misinformation. Participants mentioned that Latin American countries seem to adapt misinformation narratives spread in the United States, while members of Latin American diaspora communities receive misinformation about U.S. politics emanating from their countries of origin.
  • We found that several diaspora community organizations are already developing initiatives against misinformation, with the use of large-scale volunteer teams, fact-checking websites, and social listening tools. To do so, these organizations need targeted media literacy initiatives for their sociocultural, linguistic, and political contexts.

Implications

An extensive body of research has investigated political misinformation within the United States to the extent that researchers have published comprehensive reviews about the state of this field (Guess & Lyons, 2020; Kapantai et al., 2021; Tandoc, 2019). However, there is little scholarship about misinformation targeting diaspora communities in the United States through encrypted messaging apps (EMAs). To fill the knowledge gap regarding the extent of the spread of misinformation among diaspora communities living in the United States, we explore how leaders of diaspora community organizations understand, experience, and work to combat political misinformation. Our goal is to bring insight directly from these communities to co-design infrastructures that are able to identify and address these communities’ needs for combating misinformation spread through EMAs.

In the United States, political actors heavily target diaspora communities for misinformation campaigns, especially through encrypted messaging applications (Riedl et al., 2022). EMAs enable the spread of misinformation by way of their end-to-end encryption, social media features (e.g., forwarding messages, videos, and images), and relationship with a user’s network of phone number contact lists (Gursky et al., 2021). For instance, during the 2020 presidential election, political misinformation was spread at high rates on WhatsApp among Latinx communities (Mazzei & Medina, 2020). Several studies demonstrated the role EMAs play in spreading misinformation in countries where EMAs are the most used social media applications, such as India and Brazil (Garimella & Eckles, 2020; Newman et al., 2020; Ozawa et al., 2023).

Our research yielded evidence of cross-platform misinformation sharing, particularly between EMAs (especially Telegram and WhatsApp) and YouTube. It is critical to understand the interplay between these platforms—which participants say are especially important to diaspora communities—to combat misinformation more effectively. We also found evidence that EMAs are a testing ground for misinformation. As we will describe in more detail in the Findings section, several diaspora community leaders have the perception that political actors take advantage of the enclosed nature of EMAs to spread false narratives and test their audience’s reaction. That way, misinformation creators can identify if the narratives will gain traction among the public with less chance of being recognized. These findings indicate that community-supported monitoring of EMAs—via tools like WhatsApp Monitor (Resende et al., 2018)—may help journalists, fact-checkers, and community organizations to anticipate content that will be published on other platforms and accelerate fact-checking initiatives.

According to our participants, several diaspora community organizations are already developing initiatives against misinformation. These efforts need a workforce that understands the sociocultural and political context in which diaspora-targeting misinformation narratives flow, and funders and policymakers should allocate resources to promote such initiatives. Similarly, these communities need resources for media literacy initiatives that engage with their specific sociocultural and political contexts. Educators who develop media literacy endeavors should therefore be aware of the particular ways diaspora communities are impacted by misinformation. This would allow communities to build initiatives that address the topics, social media use patterns, and political issues pertinent to them. Likewise, policymakers should create public solutions that consider the intrinsic characteristics of these communities. In summary, there is no one-size-fits-all solution to the misinformation problem in a country as diverse as the United States. Our results confirm findings in recent literature (Austin et al., 2021; Kemei et al., 2022; Lee et al., 2023; Nguyễn et al., 2022).

Most of the people we interviewed lead organizations that were not originally created to combat misinformation. Rather, these efforts grew in response to the increasing influence campaigns experienced by their communities. Analyzing these efforts offers insight into how other organizations can reproduce similar initiatives and results. For instance, our participants described grassroots initiatives for spreading accurate information tailored to particular communities and contexts. Similarly, they mentioned creative solutions for nuanced media literacy, including the use of podcasts and targeted advertising. Several aspects of the initiatives described in this paper could apply to the U.S. general public and to broader efforts to combat misinformation.

Investigating the content spread among diaspora communities, we found evidence that there is a bidirectional path regarding the transmission of misinformation between countries. Our findings show that U.S. narratives such as QAnon are contaminating political discussions in Latin America. At the same time, our findings indicate that diaspora members in the United States are receiving misinformation about U.S. politics emanating from the countries of origin of these members. This result has implications for studies about information flows around the globe because it shows that misinformation content may also be spread in a form of asymmetrical interdependence among countries instead of a one-way flow of cultural imperialism (Straubhaar, 1991). As explained by Straubhaar (1991), asymmetrical interdependence refers to the varying degrees of power in countries’ relationships, such as their influence on culture. This concept stands in contrast to cultural imperialism, which posits the domination of culture by the United States and other so-called First World nations. Relatedly, particularly worrying is our evidence demonstrating the ways in which international actors may influence politics in the United States by targeting diaspora community members.

While we have seen recent examples of foreign attempts to influence communities of color in the United States (Bastos & Farkas, 2019; Freelon & Lokot, 2020; Riedl et al., 2022), our research sheds new light on how state interference can potentially impact intersectional diaspora communities. We share evidence that suggests international political actors might have the capacity to circumvent fact-checking mechanisms in the United States by specifically targeting diaspora communities, especially if these communities lack fact-checking resources addressing their own sociocultural and political contexts and produced in their native languages. This evidence has implications for those interested in combating misinformation aimed at influencing U.S. politics and, more specifically, marginalized communities in the United States.

Recent research shows that political misinformation was spread among diaspora communities during the 2020 and 2022 U.S. elections (Austin et al., 2021; Reddi et al., 2021; Riedl et al., 2022). We are still far from effectively addressing the misinformation problem, especially among the communities analyzed in this study. As our findings show, the perceptions of community organizations’ leaders bring important insights into this still-underexplored environment of misinformation spread: the specific topics targeting these communities, how particular platforms are used to spread misinformation, the inventive ways these organizations have responded, and what they need to sustain that response. Our findings add scholarly evidence to journalistic accounts of misinformation spread among diaspora communities in the United States (Mazzei & Medina, 2020). Identifying this phenomenon is extremely important for civil society and political actors because recognizing an issue is the first step toward tackling it.

Findings

Finding 1: Diaspora community leaders perceive encrypted messaging apps as platforms for political actors to test misinformation narratives.

Our findings suggest that misinformation content on EMAs was prolific in discussions of the 2020 U.S. election and COVID-19. One common tactic was attacking then-presidential candidate Joe Biden via false claims about his political leanings. Participants consistently acknowledged the prominence of misinformation portraying Biden, Democrats, and progressives generally as socialists and communists. In the context of the Cuban-American and Venezuelan diaspora communities, allegations about socialism and communism bear particular weight. Participants also described the challenges their communities faced with disenfranchising misinformation: they and their organizations’ members encountered messages alleging election fraud, as well as misleading information aimed at voters about mail-in ballots and the election process in general. Many grassroots organizations contended with politicized misinformation about COVID-19 vaccines, masks, and the origins of the virus.

All of our participants noted that YouTube plays a central role in the spread of misinformation among diaspora communities because much of the misinformation content spread in these communities is video-based. Ultimately, participants said that EMAs played a particularly decisive role in information (and misinformation) sharing in their various communities. They said misinformation was spread at high rates through WhatsApp, Telegram, or WeChat (depending on which app was most adopted by the community). Several of our participants noted that the enclosed, private nature of EMAs makes them a safer place for misinformation to spread, making them a testing ground for misinformation posts before sending them out to open platforms. For instance, participants pointed out that WhatsApp “is the kind of place to experiment” and that political actors can “go a little further and promote more harmful narratives” through EMAs. In other words, misinformation creators use EMAs to test new narratives and check the public’s reaction to those narratives. Furthermore, political actors can spread more harmful narratives without fear of being recognized.

Several of our participants also commented on a cross-border flow of misinformation among different countries. In our sample, the flow of misinformation seems to be bidirectional between the United States and countries in Latin America. One participant who monitors misinformation in online Latinx channels mentioned: “There’s quite a bit of Colombian YouTube accounts often talking about U.S. politics.” At the same time, misinformation narratives created in the United States were adapted for different countries. For instance, one community leader mentioned that QAnon channels were created in Venezuela, Peru, and Colombia.

Computational analysis of Telegram groups

Several of our participants from Latinx communities mentioned specific influencers and EMA groups responsible for spreading misinformation within their communities. Given this input, we analyzed 24 Telegram groups identified through these recommendations and through particular keywords mentioned in our interviews, with the goal of quantitatively confirming the perceptions of the organizations’ leaders. More specifically, we analyzed groups that spread misinformation targeting Cuban, Colombian, Argentinian, Peruvian, and Venezuelan communities. We created monthly counts to assess the volume of Telegram messages published between July 2020 and December 2021. The high volume of content spread during the period of the 2020 U.S. election confirmed statements from the participants. Notably, the highest volume of misinformation occurred after January 6, 2021, the day of the U.S. Capitol attack.
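The monthly-count step described above can be sketched with pandas. This is an illustrative toy example, not the authors’ code: the timestamps and message bodies are placeholders standing in for the Telegram records.

```python
# Minimal sketch (assumed library: pandas) of counting Telegram messages
# per calendar month to surface volume spikes around political events.
import pandas as pd

# Placeholder records standing in for (timestamp, text) Telegram messages
messages = pd.DataFrame({
    "date": pd.to_datetime(["2020-07-15", "2020-11-03", "2021-01-07", "2021-01-20"]),
    "text": ["...", "...", "...", "..."],
})

# Resample by month start ("MS") and count messages in each month
monthly = messages.set_index("date").resample("MS").size()
print(monthly)
```

Plotting `monthly` as a time series is then enough to spot peaks such as the post-January 6 surge the study reports.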

Our 20-topic model confirmed the interview findings (see Figure 1 in the Appendix). The topic model allowed us to identify the major themes discussed in our Telegram dataset of 450,300 messages. Each of the 20 graphs in Figure 1 shows words that were clustered together across all the messages in the dataset, with the most significant words in each cluster listed at the top of each graph. After a qualitative analysis of each graph, we identified two misinformation topics that were regularly mentioned in our interviews: U.S. politics (mainly messages about election fraud and Joe Biden, including topics linking keywords such as Biden and [George] Soros, and [Donald] Trump and [election] fraud) and COVID-19 (misinformation about vaccination). Although we anonymized the names of the Telegram groups due to privacy concerns, these were groups that our participants had indicated as places where influencers and organizations spread misinformation. Lastly, we confirmed the cross-platform trend, with several messages linking to content outside of Telegram, particularly on YouTube, Facebook, Twitter, and Instagram.
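The cross-platform check, tallying which external platforms the outbound links in messages point to, can be done with Python’s standard library alone. The messages below are hypothetical placeholders, not data from the study.

```python
# Hedged sketch: extract outbound link domains from message text using
# only the standard library (re, urllib.parse, collections).
import re
from collections import Counter
from urllib.parse import urlparse

# Hypothetical message texts standing in for Telegram records
messages = [
    "mira esto https://www.youtube.com/watch?v=abc123",
    "leido en https://www.facebook.com/somepage",
    "otro video https://youtube.com/watch?v=xyz",
]

url_pattern = re.compile(r"https?://\S+")

# Tally the domains that messages link out to, collapsing the www. prefix
domains = Counter(
    urlparse(url).netloc.removeprefix("www.")
    for msg in messages
    for url in url_pattern.findall(msg)
)
print(domains.most_common())
```

Ranking `domains` across the full corpus is one way to surface the YouTube-heavy cross-platform pattern the study describes.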

Finding 2: Diaspora community organizations use large-scale volunteer teams, fact-checking websites, and social listening tools.

Even though they were not originally devoted to this cause, many organizations decided to fight back against misinformation, as they see it increasingly targeting their communities. The most widespread action among these organizations is to train large-scale volunteer teams to help identify and curb misinformation in their communities. These efforts are done both digitally and in person.

Our participants believe that having people from their own communities spread accurate information brings legitimacy to the content. One participant mentions: “We think that that needs to come from a place of trust. These people need to be well respected in their community. We go through the process of training folks using text messaging scripts and being able to have a flowchart of how to respond to various pushbacks.” Interestingly, some organizations draw from public health efforts in rural regions: “They use [volunteers] a lot in rural regions, where there are not enough health care workers. So, we kind of borrowed from that concept, distributing content to the community.”

To gauge misinformation narratives spreading through their communities, organizations used fact-checking websites and social listening tools. These organizations then published informative content through YouTube channels, online influencers, podcasts, and proprietary fact-checking websites. One of the organizations uses advertising on social media to promote fact-checked content, targeting the audience of misinformation outlets. However, participants pointed out that the overwhelming amount of misinformation spread makes it “hard to keep up.” Thus, media literacy workshops, which leaders tend to see as more effective than debunking, are central to most of these communities. One participant illustrated this idea, stating that “in order for people to listen to the truth, they need to know how to find the truth in the first place.”

Finding 3: Diaspora community organizations need fact-checked content translated into their native languages and initiatives that engage with their specific contexts.

The need for translations of fact-checked content was consistent across almost all interviews. One participant expressed that this is “especially important in the South Asian diaspora, where you have dozens of languages that could be spoken.” Another participant describes how users who looked up the term “pandemic” on social media apps in English would be guided to a COVID-19 information center. Simultaneously, if someone were to look up the same term in Spanish, the apps “actually just send [the user] to a series of conspiracy theories, websites, and groups.” Moreover, leaders suggest that fact checkers should publish content through “central points” so that fact-checking efforts do not get lost.

Given that YouTube and EMAs are central to misinformation spread within these communities, leaders pointed out the urgency of monitoring tools designed for these platforms. These tools would enable faster fact-checking initiatives. One leader mentioned that misinformation tends to spread through hour-long videos on YouTube, which makes monitoring the platform “very time-consuming.” Also, these videos are created at a fast pace. For instance, one participant says that “these channels seem to have coordinated content being created with subtitles in Spanish.” According to the leader, when Tucker Carlson airs on Fox News, videos with Spanish subtitles are immediately uploaded to YouTube channels. The volume and speed of misinformation are also a problem on EMAs. Leaders expressed the need for a tool that could “parse those messages,” ideally through keywords. It would be especially important to have a tool that could identify trending content “by region or by state.”

Several leaders mentioned that their organizations need more volunteers and organizations with similar initiatives devoted to combating misinformation. Importantly, these teams need to “understand the cultural and the political context” where these communities come from. Tools to measure success and knowledge gain would be essential for these educational initiatives. For instance, one participant noted a need to distribute large-scale surveys to measure the success of their media literacy campaigns.

Methods

We applied a mixed-methods approach consisting of in-depth interviews and computer-assisted content analysis. We first conducted semi-structured, in-depth interviews with leaders from diaspora communities (N = 12) from September 2020 to September 2021. Our main goal was to bring perspectives from the two largest groups of immigrants in the United States: Asian and Latinx (Budiman, 2020). The leaders work in civil society organizations dedicated to the advancement of Chinese (N = 2), Arab (N = 1), Colombian (N = 5), Argentinian (N = 2), and Indian (N = 2) communities. Some of these civil society organizations work with broader communities and others with immigrants from specific countries; their scope of work in the United States also varies between the national and state levels. In this way, we gathered a sample of participants who offered a privileged perspective from within the communities that this study investigated. Furthermore, these leaders were involved in direct action against political misinformation and could therefore provide unique insight into how these communities were coping with the problems that this study explores. We initially recruited five participants who were featured in journalistic reports about misinformation targeting diaspora communities in the United States. We then gathered the remaining seven participants through snowball sampling based on recommendations from the initial five. Participants were six women and six men, ranging in age from 30 to 60. Interviews were conducted via Zoom, lasted 40 to 60 minutes, and were anonymized. We used a grounded theory approach (Glaser & Strauss, 1967) to identify general themes. More specifically, we first analyzed the interview memos and transcripts through open coding, then established axial codes that synthesized our open codes, and finally defined general themes that synthesized the axial codes and yielded the theoretical findings described in this article.

For the quantitative part of this study, we collected 450,300 messages published between July 2020 and December 2021 in 24 public Telegram groups that had previously shared political misinformation among Latinx communities. The community leaders we interviewed recommended these groups, naming influencers and EMA groups known to spread misinformation. We analyzed Telegram data instead of other EMAs (such as WhatsApp) because Telegram allows the user to search for public groups based on keywords, a functionality not available on WhatsApp. We first ran descriptive analytics and then used a Latent Dirichlet Allocation (LDA) model to build a 20-topic model, with the goal of exploring the main themes discussed in the Telegram groups. Topic modeling is an unsupervised machine-learning technique designed to identify the most statistically representative terms that are clustered together in a dataset (Lukito & Pruden, 2023). For privacy protection, the names of the groups and all the messages were anonymized.

Future studies would benefit from tracking the spread of misinformation narratives across different social media platforms, especially given our finding that EMAs are used as a testing ground for misinformation narratives. It would be important to assess this finding quantitatively and longitudinally, which researchers could potentially do through tools such as the WhatsApp Monitor (Resende et al., 2018). This tool surfaces content that is trending within public chat groups, making it easier to identify which narratives are gaining traction so that fact-checkers can work alongside specific communities. Similar tools for monitoring YouTube content targeting diaspora communities would also be valuable for more effective fact-checking. Another important path for future research is TikTok: as this platform becomes more popular for video sharing, it will be fundamental to study how political actors could be targeting diaspora communities with misinformation there.

While most of our findings are a result of our qualitative interviews with both Latinx and Asian diaspora community leaders, we add another layer of evidence with the quantitative analysis of Telegram groups targeting Latinx communities. We focused our quantitative analysis on Latinx communities because those were the participants who provided us with direct recommendations of misinformation spreaders in Telegram groups, which allowed us to find these groups and confirm the qualitative findings from our interviews. A limitation of this study, then, is that it does not explore Asian-related groups quantitatively. However, we do not intend to generalize these findings, qualitative or quantitative, to all diaspora communities in the United States, which would be naïve given the diversity of these nationalities, ethnicities, and cultural groups. Instead, we seek to bring insight into how misinformation has been targeting these communities through EMAs and how we can co-design infrastructures to identify and address their needs for combating misinformation.

Cite this Essay

Ozawa, J. V. S., Woolley, S., & Lukito, J. (2024). Taking the power back: How diaspora community organizations are fighting misinformation spread on encrypted messaging apps. Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-146


Bibliography

Austin, E. W., Borah, P., & Domgaard, S. (2021). COVID-19 disinformation and political engagement among communities of color: The role of media literacy. Harvard Kennedy School (HKS) Misinformation Review, 1(7). https://doi.org/10.37016/mr-2020-58

Bastos, M., & Farkas, J. (2019). “Donald Trump is my president!”: The Internet Research Agency propaganda machine. Social Media + Society, 5(3), 2056305119865466. https://doi.org/10.1177/2056305119865466

Budiman, A. (2020, August 20). Key findings about U.S. immigrants. Pew Research Center. https://www.pewresearch.org/short-reads/2020/08/20/key-findings-about-u-s-immigrants/

Freelon, D., & Lokot, T. (2020). Russian Twitter disinformation campaigns reach across the American political spectrum. Harvard Kennedy School (HKS) Misinformation Review, 1(1). https://doi.org/10.37016/mr-2020-003

Garimella, K., & Eckles, D. (2020). Images and misinformation in political groups: Evidence from WhatsApp in India. Harvard Kennedy School (HKS) Misinformation Review, 1(5). https://doi.org/10.37016/mr-2020-030

Guess, A., & Lyons, B. (2020). Misinformation, disinformation, and online propaganda. In N. Persily & J. Tucker (Eds.), Social media and democracy: The state of the field, prospects for reform (pp. 10–33). Cambridge University Press.

Glaser, B. G., & Strauss, A. L. (1967). Discovery of grounded theory: Strategies for qualitative research. Taylor & Francis.

Gursky, J., Riedl, M. J., & Woolley, S. (2021, March 19). The disinformation threat to diaspora communities in encrypted chat apps. Brookings. https://www.brookings.edu/techstream/the-disinformation-threat-to-diaspora-communities-in-encrypted-chat-apps/

Kapantai, E., Christopoulou, A., Berberidis, C., & Peristeras, V. (2021). A systematic literature review on disinformation: Toward a unified taxonomical framework. New Media & Society, 23(5), 1301–1326. https://doi.org/10.1177/1461444820959296

Kemei, J., Alaazi, D. A., Tulli, M., Kennedy, M., Tunde-Byass, M., Bailey, P., Sekyi-Otu, A., Murdoch, S., Mohamud, H., Lehman, J., & Salami, B. (2022). A scoping review of COVID-19 online mis/disinformation in Black communities. Journal of Global Health, 12, 05026. https://doi.org/10.7189/jogh.12.05026

Lee, A. Y., Moore, R. C., & Hancock, J. T. (2023). Designing misinformation interventions for all: Perspectives from AAPI, Black, Latino, and Native American community leaders on misinformation educational efforts. Harvard Kennedy School (HKS) Misinformation Review, 4(1). https://doi.org/10.37016/mr-2020-111

Lukito, J., & Pruden, M. L. (2023). Critical computation: Mixed-methods approaches to big language data analysis. Review of Communication, 23(1), 62–78. https://doi.org/10.1080/15358593.2022.2125821

Mazzei, P., & Medina, J. (2020, October 21). False political news in Spanish pits Latino voters against Black Lives Matter. The New York Times. https://www.nytimes.com/2020/10/21/us/politics/spanish-election-2020-disinformation.html

Newman, N., Fletcher, R., Schulz, A., Andi, S., & Nielsen, R. K. (2020). Reuters Institute digital news report 2020. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2020-06/DNR_2020_FINAL.pdf

Nguyễn, S., Kuo, R., Reddi, M., Li, L., & Moran, R. E. (2022). Studying mis- and disinformation in Asian diasporic communities: The need for critical transnational research beyond Anglocentrism. Harvard Kennedy School (HKS) Misinformation Review, 3(2). https://doi.org/10.37016/mr-2020-95

Ozawa, J. V. S., Woolley, S. C., Straubhaar, J., Riedl, M. J., Joseff, K., & Gursky, J. (2023). How disinformation on WhatsApp went from campaign weapon to governmental propaganda in Brazil. Social Media + Society, 9(1). https://doi.org/10.1177/20563051231160632

Reddi, M., Kuo, R., & Kreiss, D. (2021). Identity propaganda: Racial narratives and disinformation. New Media & Society, 25(8), 2201–2218. https://doi.org/10.1177/14614448211029293

Resende, G., Messias, J., Silva, M., Almeida, J., Vasconcelos, M., & Benevenuto, F. (2018). A system for monitoring public political groups in WhatsApp. In WebMedia ’18: Proceedings of the 24th Brazilian symposium on multimedia and the web (pp. 387–390). Association for Computing Machinery. https://doi.org/10.1145/3243082.3264662

Riedl, M. J., Ozawa, J. V. S., Woolley, S., & Garimella, K. (2022). Talking politics on WhatsApp: A survey of Cuban, Indian, and Mexican American diaspora communities in the United States [White paper]. Center for Media Engagement. https://mediaengagement.org/research/cuban-indian-mexican-american-communities-in-the-united-states

Straubhaar, J. D. (1991). Beyond media imperialism: Asymmetrical interdependence and cultural proximity. Critical Studies in Mass Communication, 8(1), 39–59. https://doi.org/10.1080/15295039109366779

Tandoc Jr., E. C. (2019). The facts of fake news: A research review. Sociology Compass, 13(9), e12724. https://doi.org/10.1111/soc4.12724

Funding

This study is a project of the Center for Media Engagement (CME) at the University of Texas at Austin, where research is supported by the Open Society Foundations, the Omidyar Network, and the John S. and James L. Knight Foundation.

Competing Interests

The authors declare no competing interests.

Ethics

The study received approval from the Institutional Review Board (IRB) of the University of Texas at Austin on October 31, 2019. Human subjects provided informed consent to participate in this study. We approached leaders who worked in diaspora community organizations committed to the advancement of Latinx and Asian communities in the United States, but the investigators did not determine ethnicity or gender categories when recruiting the participants.

Copyright

This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided that the original author and source are properly credited.

Data Availability

To protect the privacy of our participants, the interview memos and recordings used in this study cannot be released. To protect the privacy of Telegram users, the content collected on Telegram groups also cannot be released.