Abstract
Why does online distrust (e.g., of medical expertise) continue to grow despite numerous mitigation efforts? We analyzed changing discourse within a Facebook ecosystem of approximately 100 million users who were focused pre-pandemic on vaccine (dis)trust. Post-pandemic, their discourse interconnected multiple non-vaccine topics and geographic scales within and across communities. This interconnection confers a unique, system-level (i.e., at the scale of the full network) resistance to mitigations targeting isolated topics or geographic scales, an approach many schemes take due to constrained funding (e.g., focusing on local health issues but not national elections). Backed by numerical simulations, we propose counterintuitive solutions for more effective, scalable mitigation: utilize “glocal” messaging by blending (1) strategic topic combinations (e.g., messaging about specific diseases with climate change) and (2) geographic scales (e.g., combining local and national focuses).
Introduction
Distrust and its associated mis/disinformation—however defined—are now a widespread threat to public health (e.g., abortion, COVID-19, mpox (previously called monkeypox)), science (e.g., climate change), election processes and even national security1,2,3,4,5. The pandemic exacerbated this issue as many people turned to their trusted online communities for advice and to share distrust of official health messaging6,7,8,9,10,11,12,13,14,15,16,17,18. Within a month of the U.S. national emergency declaration19, Facebook—the largest and most widely used social media platform—saw a 50% increase in messaging and 70% increase in time spent20, driving its monthly active users to 2.6 billion.
To combat the growth of online distrust and mis/disinformation, myriad ingenious mitigation strategies have been introduced and implemented on social media platforms12,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38. For instance, Facebook has funded internal efforts like adding misinformation labels to posts25, while The Mercury Project, spearheaded by the Social Science Research Council with support from philanthropic organizations like The Rockefeller Foundation, Robert Wood Johnson Foundation, and Craig Newmark Philanthropies, has funded vaccine promotion campaigns28. Depending on its funding source, a mitigation scheme typically focuses on a specific topic (e.g., COVID-19, elections, climate change) and a specific geographic scale, such as a single state, a nation (e.g., APS), or worldwide (e.g., the E.U.). However, despite the diversity of mitigation schemes, distrust continues to be widespread.
Here we provide an answer to the question of why distrust continues to be widespread despite these efforts, and the counterintuitive solution that this answer suggests. Specifically, we show that post-pandemic distrust has developed a massive glocal web that—within individual communities and across interconnected communities—blends distinct topics, locations, and geographic scales. This makes it resilient to current mitigation schemes that only focus on a specific topic or geographic scale (e.g., due to funding mandate)39. Given that such schemes also operate independently, this suggests widespread distrust would remain resilient even if these schemes were implemented at mass scale. We show this in Figs. 1, 2 and 3 by analyzing the post-pandemic discourse across the Facebook ecosystem of approximately 100 million individuals that—pre-pandemic—was centered on vaccine distrust40. Combining this with an agent-based simulation, Fig. 4 shows how this web-of-distrust can be dismantled by making individual mitigation schemes blend topics and scales.
Methods
To examine how the distrust discourse changed post-pandemic, we revisited the 2019 Facebook ecosystem from Ref.40 that had centered around vaccines and comprised interlinked anti-, pro-, and neutral-vaccination Facebook pages. Our full methodology is given in the SI and follows Ref.40. Throughout, a ‘community’ is a Facebook page (i.e., a node in Fig. 1) with a unique ID; this usage is unrelated to community detection in network science. These communities provide spaces where users gather around shared interests, thereby promoting trust among them41,42,43,44,45,46 and potential collective distrust of other issues6,22,40,47,48,49,50,51,52,53,54.
Our trained researchers manually and independently classified each community involved in the vaccine debate, with consensus checks performed in cases of disagreement. This yielded a network of 1356 interlinked communities across countries and languages, with 86.7 million individuals in the largest network component: 211 pro-vaccination communities (blue nodes, Fig. 1B) with 13.0 million individuals; 501 anti-vaccination communities (red nodes, Fig. 1B) with 7.5 million individuals; and 644 neutral communities (neither blue nor red nodes, Fig. 1B) with 66.2 million individuals. These neutral communities were further sub-categorized by type based on their title and description (e.g., parenting).
The discourse within each of the 1356 communities was categorized by topic prevalence. To do so, we developed word filters for the five dominant non-vaccine topics: COVID-19, mpox, abortion, elections, and climate change. Topics outside these five (e.g., sports) are mentioned, but their frequency is generally much lower. By 'non-vaccine' we mean topics not centered on vaccines in a broad sense; although the COVID-19 and mpox filters included some vaccination-related terms (e.g., “monkeepox vax’n”), the intent was to capture discourse specifically about these diseases, not vaccines generally. Had we filtered for general vaccine discussion rather than disease-specific discourse, it would have been difficult to reliably distinguish COVID-19-related posts from mpox-related posts in an automated manner without additional human classification. The filters combined regular expressions and keyword searches across post content, descriptions, image tags, and link text in multiple languages (see SI Sect. 7 for details on the filtering methodology).
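The actual filter definitions are given in SI Sect. 7; as a minimal illustrative sketch (with hypothetical keyword patterns, not the study's real filters), a regex-based topic filter over a post's combined text fields can be structured as follows:

```python
import re

# Hypothetical keyword patterns per topic -- illustrative only,
# not the study's actual multilingual filters (see SI Sect. 7).
TOPIC_PATTERNS = {
    "covid19": re.compile(r"\b(covid[- ]?19|coronavirus|sars[- ]?cov[- ]?2)\b", re.I),
    "mpox":    re.compile(r"\b(mpox|monkey ?pox)\b", re.I),
    "climate": re.compile(r"\b(climate change|global warming)\b", re.I),
}

def classify_topics(post):
    """Return the set of topics whose pattern matches any text field of a post."""
    # Concatenate the fields that the study's filters search:
    # post content, descriptions, image tags, and link text.
    text = " ".join(post.get(field, "") for field in
                    ("content", "description", "image_tags", "link_text"))
    return {topic for topic, pat in TOPIC_PATTERNS.items() if pat.search(text)}

post = {"content": "New guidance on monkeypox", "link_text": "covid-19 update"}
classify_topics(post)  # matches both the mpox and COVID-19 patterns
```

A community's topic prevalence then follows from aggregating these per-post labels over all of its posts.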
A link is shown between two communities (Facebook pages) A and B when community A recommends community B to its members at the page level. This creates a prominent hyperlink from A to B indicating community A's interest in B, which differs from a member of A simply mentioning some content from B. A link does not necessarily mean the two communities agree. Instead, it directs the attention of A's members to B; conversely, it exposes A to feedback and content from B. While not all members will necessarily pay attention, a committed minority of 25% can be enough to influence the stance of an online community55.
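These page-level recommendation links define a directed graph, and the "largest network component" reported above is its largest weakly connected component (link direction ignored). A minimal sketch with hypothetical page IDs:

```python
from collections import defaultdict

# Hypothetical recommendation links: (A, B) means page A recommends page B.
# Illustrative IDs only -- the real network has 1356 pages (see SI).
edges = [("antivax_1", "parenting_3"), ("parenting_3", "provax_2"),
         ("provax_2", "antivax_1"), ("gmo_5", "climate_7")]

# Weak connectivity ignores link direction, so build an undirected adjacency.
undirected = defaultdict(set)
for a, b in edges:
    undirected[a].add(b)
    undirected[b].add(a)

def largest_weak_component(adj):
    """Return the largest weakly connected component via iterative DFS."""
    seen, best = set(), set()
    for start in adj:
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(adj[node] - comp)
        seen |= comp
        best = max(best, comp, key=len)
    return best

largest_weak_component(undirected)  # the three mutually linked pages
```

On the real data, the component returned this way contains the 86.7 million individuals cited in Methods.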
Of the 1356 communities, 342 identify as local, comprising around 3.1 million individuals, while the remaining 1014 are global, comprising around 83.7 million individuals. In terms of geography, a global community is a page with a broad, worldwide focus that is not tied to a specific location, while a local community is focused on a specific geographic area, such as a neighborhood, city, county, state, or country (e.g., "Vaccine information for parents" or "Global Trends" vs. "Vaccine information for Los Angeles County parents"). In terms of topic, a global community discusses diverse issues broadly, whereas a local community has a narrow topical focus (e.g., pages discussing only elections). The size of each community can be estimated from its number of likes, given that the average user likes only one Facebook page7; however, our analysis and findings do not depend on this.
Thus, the terms ‘global’ and ‘local’ are applied in two dimensions, geographic and topical, which allows communities to occupy different glocal positions. For instance, a community focused on a narrow topic within a small locality embodies hyperlocality on both dimensions, while one that discusses many topics worldwide embodies hyperglobality. Our analysis aims to demonstrate that post-pandemic, vaccine-skepticism discourse expanded beyond hyperlocal geography and topics to encompass more hyperglobal geography and topics.
In our numerical simulations, ‘mitigation’ refers to efforts that aim to counter mis/disinformation, such as fact-checking, verification, public awareness campaigns, collaborative initiatives, research, and global programs, rather than banning communities. These strategies have been implemented to address false narratives related to topics like COVID-19, public health, elections, and climate change12,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38. For instance, Facebook has funded internal efforts like adding misinformation labels to posts, while the Mercury Project has conducted vaccine promotion campaigns25,28. Many strategies operate at a national level, like the UK's ‘Don't Feed the Beast’ campaign23, while others have a global focus, like the United Nations' VERIFIED initiative26. The strategies aim to limit the spread and impact of misinformation by encouraging critical thinking, accurate reporting, and authoritative voices. However, persistent exposure to narratives via social connections can enable reactivation of silenced misinformation.
Thus, ‘deactivation’ refers to temporarily ‘silencing’ communities by removing their network links as part of a mitigation campaign focused on a particular topic or locality, while ‘reactivation’ allows deactivated communities to resume discussing the targeted topic based on continued exposure to narratives via their remaining network connections. This represents the limited, temporary effectiveness of real-world debunking efforts, with misinformation narratives persisting over time. Our agent-based model thus simulates the effects of different mitigation strategies by ‘deactivating’ communities through removing their network connections, then allowing ‘reactivation’ based on remaining links. The geography-focused simulation tests the impact of messaging targeted at either local or global communities, whereas the topic-focused simulation compares single- versus multi-topic debunking (see SI Sect. 10 and 11 for full details). The goal is to capture general principles of network resilience, given the known limited impacts of current mitigation strategies.
Results
Figure 1 shows this web-of-distrust during the period 5/1/2022 to 10/17/2022, which included several significant events: (1) the first confirmed U.S. case of the mpox outbreak56; (2) the U.S. Supreme Court's reversal of Roe v. Wade57; (3) President Biden signing into law the Inflation Reduction Act58; (4) primary and run-off elections ahead of the November midterms59. Communities that share more links appear visually closer together and have a higher likelihood of exerting influence on each other through shared content and infiltration. This is because the layout results from a color-agnostic physical calculation (ForceAtlas2) in which nodes repel each other with a force that decays with separation, and linked nodes have an additional attractive spring force60.
This web-of-distrust entangles five dominant topics within and across communities: abortion, mpox, COVID-19, climate change, and elections (panels C–G). This means that within a given community, distrust about one topic can immediately be reinforced by distrust about other topics, as well as by the collective distrust of other communities to which it is linked. Hence post-pandemic intervention on one topic and scale invites pushback from distrust on other topics and across scales, making the distrust ecosystem resilient to mitigation schemes that focus on a particular topic and geographic scale. For example, distrust of state elections within a community associated with a small U.S. city is reinforced by distrust in the U.S. federal government on mpox, which is in turn reinforced by distrust over climate change from another community representing itself as the U.K. mainstream. This global-local entanglement across topics and scales adds new resilience to the distrust ecosystem.
Figure 1 has counterintuitive implications for messaging the public and hence intervening against distrust (see SI for details and statistical analyses). One might expect discussions surrounding elections and abortion to focus on a specific geographic scale within the U.S. due to specific laws, politics, and healthcare systems. However, a chi-squared test shows that the local share varies significantly by topic. Among communities discussing only a single topic, 25.21% (COVID-19), 35.14% (mpox), 17.65% (abortion), 38.71% (elections), and 24.32% (climate change) are local, in contrast to the 31.9% expected under the null hypothesis of no relationship between topic and geographic locale (see SI Sect. 7). This implies that single-topic messaging around COVID-19, abortion, or climate change should take a more global perspective, while that around mpox and elections should be more local. The difference between mpox and COVID-19 also warns against one-size-fits-all public health messaging, despite both being emerging diseases.
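The expected value follows from the standard contingency-table calculation: under the null hypothesis of no association, the expected count in each cell is (row total × column total) / grand total. A minimal sketch with hypothetical counts (the study's actual counts are in SI Sect. 7):

```python
# Hypothetical 2x5 contingency table: rows = locale, columns = topic of
# single-topic communities. Illustrative numbers only, not the study's data.
observed = {
    "local":  {"covid": 30, "mpox": 13, "abortion": 6, "elections": 24, "climate": 9},
    "global": {"covid": 89, "mpox": 24, "abortion": 28, "elections": 38, "climate": 28},
}

col_totals = {t: sum(observed[r][t] for r in observed) for t in observed["local"]}
row_totals = {r: sum(observed[r].values()) for r in observed}
grand = sum(row_totals.values())

# Expected local fraction under the null of no topic-locale association
# (the analogue of the 31.9% reported in the text):
expected_local_share = row_totals["local"] / grand

# Expected count of local communities per topic under the null:
expected_local = {t: expected_local_share * n for t, n in col_totals.items()}
```

The chi-squared statistic then sums (observed − expected)² / expected over all ten cells; libraries such as `scipy.stats.chi2_contingency` return these expected counts directly.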
Moreover, traditional anti-vax communities, a third of which are local, have a lower interest in abortion (Fig. 1C). Climate change is the most popular topic among illness communities, of which only a few are local (Fig. 1D). Parenting communities are closely associated with alternative health communities and discuss many topics (Fig. 1F). Conspiracy theory communities show high interest in abortion and elections and are 50% local (Fig. 1G). GMO communities have the highest percentage of local communities (53.8%), compared to 25.2% across all communities in Fig. 1B.
Figure 2 confirms the wide distribution of the distrust discourse across topics and geographic scales. For anti-vaccination communities, this is measured as the number of topics about which distrust is being actively promoted. One might expect that as a community addresses more topics, the number of potential flashpoints for internal disagreements would increase, so that fewer communities and individuals would engage with higher numbers of topics; the opposite happens. Furthermore, local communities are overrepresented among multi-topic communities compared to the full dataset: hence future mitigation schemes, including global ones, should be designed so that local communities and their interests feature prominently.
Figure 3A,B show the importance of specific combinations of topics for effective public messaging. The subset of communities discussing all five topics is the third largest but has the highest number of individuals at 19.9 million. There are fewer communities that only discuss mpox, abortion, or elections. There is significant conversation overlap between COVID-19 and climate change, particularly in global communities. Despite the involvement of pro-vaccination communities in these discussions, the dialog is mostly led by communities that do not promote guidance consistent with current scientific consensus—and in many cases, these are communities that actively oppose it, especially at the local level.
The heatmaps in Fig. 3C,D compare how different pairs of topics were discussed across the distrust ecosystem during key 2022 periods. In Fig. 3C, which includes the first U.S. mpox case, we observe a greater proportion of anti-vaccination communities discussing mpox in connection with COVID-19 compared to pro-vaccination communities; the latter continued to focus more on COVID-19 and climate change interactions (see SI Sect. 9 for full details). This distribution suggests potential gaps in authoritative medical guidance about mpox during this early stage. In Fig. 3D, coinciding with the reversal of Roe v. Wade, pro-vaccination communities discussed abortion predominantly in the context of COVID-19, while neutral communities focused on connections between abortion and climate change. However, a higher proportion of anti-vaccination communities maintained messaging linking COVID-19 with climate and elections. This messaging pattern combining dominant topics resembles real multi-virus interference62: the strategic blending of topics helps suppress distrust around other issues. By examining the relative size of communities over time, we uncover occasional breakthroughs where pro-vaccination groups temporarily gain control of specific topic interactions, despite their smaller overall numbers at the 2-topic level (SI Sect. 9).
Figure 4 uses an agent-based simulation to compare the effectiveness of mitigation schemes that target a specific topic or geographic scale versus schemes that blend topics and scales. The model simulates ‘deactivating’ communities by removing their network links, then allows ‘reactivation’ based on connections. In the geography-focused simulation (Fig. 4A, SI Sect. 10), local nodes are randomly ‘deactivated’ at each time step to represent geographically targeted debunking/fact-checking campaigns focused on that locality. Deactivated nodes can then reactivate based on the proportion of global pages they follow. The topic-focused simulation (Fig. 4B, SI Sect. 11) evaluates the impact of single vs. multi-topic messaging. Topics are chosen at a granular level (e.g., “COVID-19”, “COVID-19 and mpox”), and nodes discussing a targeted topic have their discussion suppressed to model the application of topic-specific debunking.
Nodes can then reactivate and resume targeted discussions based on network connections to other active nodes, representing the limited effectiveness of real-world debunking. Specifically, reactivation likelihood is determined by the proportion of a node's connections still actively discussing the targeted topic or locality. This reactivation component, based on ongoing content exposure through network links, captures the stubborn persistence of narratives despite isolated mitigation attempts. Both simulations were performed over 1500 iterations, with results averaged (see SI Sect. 10–11 for details).
In Fig. 4A, a local community's chance of reactivation is determined by the proportion of global communities it follows out of its total connections. This reactivation probability remains static since the geographic global/local status of a community is constant. Conversely, in Fig. 4B, the reactivation likelihood for a community to resume discussing a specific topic is based on the fraction of connections currently posting about that topic. This probability is dynamic, being recalculated at every timestep of the simulation due to the ever-changing nature of topic discussions among communities. Notably, communities that follow only local communities or do not follow any communities discussing the topic have a 0% chance of reactivation. In contrast, those following only global communities or all communities discussing the topic have a 100% chance of reactivation.
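The deactivation/reactivation mechanic can be sketched in a few lines; this is a toy version on a hypothetical random graph, not the study's model, whose full specification (real network, parameters, averaging over 1500 iterations) is in SI Sect. 10–11:

```python
import random

random.seed(0)

# Hypothetical network: 200 communities, each linked to 6 random others.
# Illustrative stand-in for the real 1356-page Facebook network.
N = 200
links = {i: random.sample([j for j in range(N) if j != i], 6) for i in range(N)}
active = {i: True for i in range(N)}  # True = still discussing the target topic

def step(deactivation_rate=0.05):
    """One timestep: deactivate a random fraction of communities (topic-focused
    debunking), then let each silenced community reactivate with probability
    equal to the fraction of its connections still discussing the topic."""
    for i in random.sample(range(N), int(N * deactivation_rate)):
        active[i] = False
    snapshot = dict(active)  # synchronous update: all nodes see the same state
    for i in range(N):
        if not snapshot[i]:
            p = sum(snapshot[j] for j in links[i]) / len(links[i])
            if random.random() < p:
                active[i] = True

for _ in range(100):
    step()
remaining = sum(active.values())  # communities still discussing the topic
```

As in Fig. 4B, the reactivation probability here is dynamic, recomputed each timestep from the current discussion state; the geography-focused variant (Fig. 4A) instead uses a static probability given by each local community's fraction of global connections.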
We thus find that post-pandemic, the Facebook communities originally focused on vaccines strongly entangle multiple non-vaccine topics and geographic scales, both within and across communities. As demonstrated by our simulations, this gives the current distrust ecosystem a unique system-level resistance to mitigations that target a specific topic or geographic scale, as is the case for many current schemes due to their funding focus. The geographic-scale simulation (Fig. 4A) shows it is not possible to ‘deactivate’ all local communities with messaging focused solely on the local level, because of reactivation driven by interconnectedness. The topic-scale simulation (Fig. 4B) demonstrates the superior effectiveness of multi-topic compared to single-topic messaging for reducing discussions, enabled by the entanglement across topics. The curves in Fig. 4B show how the number of communities discussing a particular topic decreases over time under the mitigation strategy; the overall trends are represented by averaged reduction curves, whose separate components can be found in SI Sect. 11. The effectiveness of each type of messaging was measured by the rate and magnitude of the decrease in the number of communities over time. Even for two-topic messaging, the proportion of communities discussing a topic decreases to less than 50% in under 600 steps, and decreases further with more topics. Though 3-, 4-, and 5-topic messaging perform similarly, these results suggest two-topic messaging should be employed by default in future mitigation schemes, since complete knowledge of the distrust web is not required to implement this approach effectively.
Discussion
Our results reveal new insights into the structure and shifts within the online vaccine distrust ecosystem on Facebook, pointing to potential improvements in mitigation strategies. We analyzed a large Facebook ecosystem of ~100 million users focused pre-pandemic on vaccine attitudes. Post-pandemic, their conversations blended multiple topics and geographic scales, conferring system-level resistance to targeted interventions. This highlights gaps in current approaches, which are often constrained by the narrow funding focuses of supporting entities. For example, government-funded efforts typically target misinformation only on certain topics relevant to public health or elections12,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38. Instead, effective mitigation may require “glocal” messaging combining strategic topic and scale mixes, for example pairing specific diseases with climate change, or blending local and national focuses.
Our dataset represents only select languages popular on Facebook63; however, SI Sect. 2 uses page administrator locations as a proxy to show that our dataset is indeed diverse. Additional sentiment analysis or natural language processing of posts could further enrich insights into the dataset, and of course other social media platforms exist. We note that although our study is technically a sample of the actual online population, the large number involved (approximately 100 million) suggests it qualifies as a crude population-level map. Indeed, we did not obtain the nodes and links by simple sampling but rather by detection and then following links from node to node. After a while, this process tended to return to the same nodes, which, like circling the globe, suggests that we have mapped, albeit crudely, the skeleton of the true online distrust ecosystem. Thus, despite limitations, these initial maps of this ecosystem’s structure and post-pandemic shifts already indicate deficiencies in current mitigation efforts while pointing to alternative strategies worthy of deeper exploration.
Data availability
All data needed to evaluate the conclusions in the paper are present in the paper and Supplementary Information (Refs.40,42,55,56,57,58,59,64,65,66,67,68,69,70,71). The code used to generate the map in Fig. 1, and from which the results in Figs. 2 and 3 are obtained, is Gephi, which is free, open-source software. Figure 4 was obtained using Mathematica.
References
Nobel Prize Summit. NobelPrize.org. https://www.nobelprize.org/events/nobel-prize-summit/2023/.
Trust Science Pledge Calls for Public to Engage in Scientific Literacy. News Direct. https://newsdirect.com/news/trust-science-pledge-calls-for-public-to-engage-in-scientific-literacy-737528151.
Meyers, C. American Physical Society Takes On Scientific Misinformation. http://aps.org/publications/apsnews/202203/misinformation.cfm.
Zaid, J. AAAS 2022 Annual Meeting: How to Tackle Mis- and Dis-information | American Association for the Advancement of Science (AAAS). https://www.aaas.org/news/aaas-2022-annual-meeting-how-tackle-mis-and-dis-information.
Beyond Disinformation—EU Responses to the Threat of Foreign Information Manipulation (2023). https://www.youtube.com/watch?v=YJf2pZGe36Q.
Ardia, D. S., Ringel, E., Ekstrand, V. & Fox, A. Addressing the decline of local news, rise of platforms, and spread of mis- and disinformation online: A summary of current research and policy proposals. SSRN J. https://doi.org/10.2139/ssrn.3765576 (2020).
Hootsuite Inc. Digital Trends—Digital Marketing Trends (2022). https://www.hootsuite.com.
Lappas, G., Triantafillidou, A., Deligiaouri, A. & Kleftodimos, A. Facebook content strategies and citizens’ online engagement: The case of Greek local governments. Rev. Socionetwork. Strat. 12, 1–20 (2018).
Kleineberg, K.-K. & Boguñá, M. Competition between global and local online social networks. Sci. Rep. 6, 25116 (2016).
Rao, A., Morstatter, F. & Lerman, K. Partisan asymmetries in exposure to misinformation. Sci. Rep. 12, 15671 (2022).
Weight-loss injections have taken over the internet. But what does this mean for people IRL?. MIT Technology Review. https://www.technologyreview.com/2023/03/20/1070037/weight-loss-injections-societal-impact-ozempic/.
Getting Ahead of Misinformation. Democracy Journal (2023). https://democracyjournal.org/magazine/68/getting-ahead-of-misinformation/.
DiResta, R. The Digital Maginot Line. Ribbonfarm (2018). https://www.ribbonfarm.com/2018/11/28/the-digital-maginot-line/.
Misinformation, Crisis, and Public Health—Reviewing the Literature—MediaWell. https://mediawell.ssrc.org/?post_type=ssrc_lit_review&p=58936.
Larson, H. J. Blocking information on COVID-19 can fuel the spread of misinformation. Nature 580, 306 (2020).
Douek, E. Content moderation as systems thinking. Harvard Law Review, 136 (2022). https://harvardlawreview.org/print/vol-136/content-moderation-as-systems-thinking/.
Chen, E., Lerman, K. & Ferrara, E. Tracking social media discourse about the COVID-19 pandemic: Development of a public coronavirus twitter data set. JMIR Public Health Surveill. 6, e19273 (2020).
Semenov, A. et al. Exploring social media network landscape of post-soviet space. IEEE Access. 7, 411–426 (2019).
Ghaffary, S. People are using Facebook more than ever during the coronavirus pandemic—but its business is still taking a hit. Vox (2020). https://www.vox.com/2020/4/29/21241601/facebook-coronavirus-pandemic-users-advertising-growth-making-losing-money-users-q1-2020-earnings.
Keeping Our Services Stable and Reliable During the COVID-19 Outbreak. Meta (2020). https://about.fb.com/news/2020/03/keeping-our-apps-stable-during-covid-19/.
What’s Being Done to Fight Disinformation Online. https://www.rand.org/research/projects/truth-decay/fighting-disinformation.html.
Managing the COVID-19 infodemic: Promoting healthy behaviours and mitigating the harm from misinformation and disinformation. https://www.who.int/news/item/23-09-2020-managing-the-covid-19-infodemic-promoting-healthy-behaviours-and-mitigating-the-harm-from-misinformation-and-disinformation.
Government to relaunch ‘Don’t Feed the Beast’ campaign to tackle Covid-19 misinformation—Society of Editors. https://www.societyofeditors.org/soe_news/government-to-relaunch-dont-feed-the-beast-campaign-to-tackle-covid-19-misinformation/.
Get the facts on coronavirus. Full Fact. https://fullfact.org/health/coronavirus/.
COVID-19 Information Center | Meta. COVID-19 Information Center. https://about.meta.com/covid-19-information-center.
VERIFIED: UN launches new global initiative to combat misinformation. Africa Renewal (2020). https://www.un.org/africarenewal/news/coronavirus/covid-19-united-nations-launches-global-initiative-combat-misinformation.
Restoring Trust in Public Health (2023). https://doi.org/10.26099/j7jk-j805.
New USD10 Million Project Launched To Combat the Growing Mis- and Disinformation Crisis in Public Health. The Rockefeller Foundation. https://www.rockefellerfoundation.org/news/new-usd10-million-project-launched-to-combat-the-growing-mis-and-disinformation-crisis-in-public-health/.
Navigating Infodemics and Building Trust during Public Health Emergencies A Workshop | National Academies. https://www.nationalacademies.org/event/04-10-2023/navigating-infodemics-and-building-trust-during-public-health-emergencies-a-workshop.
Understanding and Addressing Misinformation About Science A Public Workshop | National Academies. https://www.nationalacademies.org/event/04-19-2023/understanding-and-addressing-misinformation-about-science-a-public-workshop.
Mirza, S. et al. Tactics, threats and targets: Modeling disinformation and its mitigation. In Proceedings 2023 Network and Distributed System Security Symposium (Internet Society, San Diego, 2023). https://www.ndss-symposium.org/wp-content/uploads/2023/02/ndss2023_s657_paper.pdf.
Roozenbeek, J., van der Linden, S., Goldberg, B., Rathje, S. & Lewandowsky, S. Psychological inoculation improves resilience against misinformation on social media. Sci. Adv. 8, eabo6254 (2022).
van der Linden, S., Leiserowitz, A., Rosenthal, S. & Maibach, E. Inoculating the public against misinformation about climate change. Glob. Chall. 1, 1600008 (2017).
Smith, R., Cubbon, S. & Wardle, C. Under the surface: Covid-19 vaccine narratives, misinformation and data deficits on social media (First Draft, 2020). https://firstdraftnews.org/long-form-article/under-the-surface-covid-19-vaccine-narratives-misinformation-and-data-deficits-on-social-media/.
Calleja, N. et al. A public health research agenda for managing infodemics: Methods and results of the first WHO infodemiology conference. JMIR Infodemiol. 1, e30979 (2021).
Lazer, D. M. J. et al. The science of fake news. Science 359, 1094–1096 (2018).
Lewandowsky, S. et al. Debunking Handbook 2020 1–19 (George Mason University, 2020).
Acknowledgements
N.F.J. is supported by U.S. Air Force Office of Scientific Research awards FA9550-20-1-0382 and FA9550-20-1-0383.
Author information
Contributions
L.I. analyzed the results and generated the figures. N.F.J. supervised the project. L.I. and N.F.J. wrote the paper. All authors contributed to reviewing the final manuscript and to the conceptualization, methodology, and validation.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher's note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Illari, L., Restrepo, N.J. & Johnson, N.F. Rise of post-pandemic resilience across the distrust ecosystem. Sci Rep 13, 15640 (2023). https://doi.org/10.1038/s41598-023-42893-6