Dissemination

Internal code: 4.3

Target groups: (large) (social media) platforms, recipients

General description
 * Preventing disinformation from spreading (disseminating).
 * (Van der Linden (2020) ) "Given the unprecedented scale and pace at which misinformation can now travel online, research has increasingly relied on models from epidemiology to understand the spread of fake news. In these models, the key focus is on the reproduction number (R0)—in other words, the number of individuals who will start posting fake news (that is, secondary cases) following contact with someone who is already posting misinformation (the infectious individual). It is therefore helpful to think of misinformation as a viral pathogen that can infect its host, spreading rapidly from one individual to another within a given network, without the need for physical contact. One benefit of this epidemiological approach lies in the fact that early detection systems could be designed to identify, for example, superspreaders, which would allow for the timely deployment of interventions to curb the spread of viral misinformation."
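The epidemiological analogy above can be made concrete with a toy branching-process simulation; this is a minimal sketch, assuming Poisson-distributed "offspring" posts and an illustrative superspreader share (all parameter values are hypothetical, not fitted to real data):

```python
import math
import random

def poisson_draw(rng, lam):
    """Poisson sample via Knuth's algorithm (adequate for small lam)."""
    if lam <= 0:
        return 0
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        p *= rng.random()
        k += 1
    return k - 1

def simulate_cascade(r0, superspreader_share=0.0, superspreader_r0=0.0,
                     generations=10, seed_posts=1, seed=42):
    """Branching-process sketch of misinformation spread.

    Each infectious poster 'infects' a Poisson-distributed number of
    new posters (mean r0); a small fraction are superspreaders with a
    much higher mean. Returns the number of new posters per generation.
    """
    rng = random.Random(seed)
    active = seed_posts
    history = [active]
    for _ in range(generations):
        new_posters = 0
        for _ in range(active):
            lam = superspreader_r0 if rng.random() < superspreader_share else r0
            new_posters += poisson_draw(rng, lam)
        active = new_posters
        history.append(active)
    return history
```

In this framing, cascades tend to die out when the effective R0 falls below 1, so interventions (accuracy nudges, downranking, early detection of superspreaders) can be modeled simply as reductions of r0 or superspreader_share.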

Interventions
 * (Pennycook, G., Rand, D. (2021)) "There is ... a large disconnect between what people believe and what they will share on social media, and this is largely driven by inattention rather than by purposeful sharing of misinformation. Effective interventions can nudge social media users to think about accuracy, and can leverage crowdsourced veracity ratings to improve social media ranking algorithms."
 * (Jankowicz, N. (2020)) Dream, 2028: “Party leaders, editors-in-chief of the country’s newspapers, major broadcasters, and even the social media platforms convened to sign a Declaration of Truth and Integrity: they would not discuss, cover, or amplify content for which a legitimate source could not be established.”
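The crowdsourced-ratings idea from Pennycook & Rand can be sketched as a simple re-ranking step; the field names, scoring formula, and penalty exponent below are illustrative assumptions, not any platform's actual ranking algorithm:

```python
def rerank_with_crowd_ratings(posts, penalty=2.0):
    """Downweight posts by crowdsourced accuracy ratings before ranking.

    posts: dicts with 'engagement' (raw ranking signal) and
    'crowd_accuracy' (mean crowd veracity rating, 0.0 = judged false,
    1.0 = judged true). All names here are hypothetical.
    """
    def score(post):
        # A low crowd rating shrinks the effective score sharply;
        # 'penalty' controls how aggressively dubious posts sink.
        return post["engagement"] * post["crowd_accuracy"] ** penalty
    return sorted(posts, key=score, reverse=True)

# Hypothetical feed: one viral but dubious post vs. two accurate ones.
feed = [
    {"id": "a", "engagement": 900.0, "crowd_accuracy": 0.2},
    {"id": "b", "engagement": 300.0, "crowd_accuracy": 0.9},
    {"id": "c", "engagement": 500.0, "crowd_accuracy": 0.6},
]
ranked = rerank_with_crowd_ratings(feed)  # "b" and "c" now outrank "a"
```

The design choice is that veracity acts as a multiplicative discount on engagement rather than a hard filter, which matches the spirit of reducing amplification without removing content.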

Dissemination Impact measurement

Assumptions

Besides the dissemination of disinformation, there is possibly a second element worth considering: self-censorship.
 * (NATO 2021) "In the past decades, the development of modern means of communication has made the information landscape increasingly complex. It is no longer the preserve of a select few actors such as states and traditional media. Rather, it favors many-to-many interactions between individuals who generate a large proportion of the content themselves."
 * (NATO 2021) "The opening-up of the online information space has, however, been exploited by opportunistic actors to spread disinformation and propaganda faster and further than ever before. They take advantage of several vulnerabilities intrinsic to the nature of the online domain. First, by facilitating the creation and dissemination of information, online tools allow for a multiplication of sources. Although theoretically positive, this proliferation can overwhelm the public and make it difficult to assess the credibility of those sources and, therefore, to gauge the reliability of the information received (NATO Strategic Communications, 2016; Lazer et al., 2017). Second, although crucial to free speech, the anonymity of the online space allows ill-intentioned actors to spread harmful falsehoods covertly and effortlessly without taking responsibility for their effects (Cordy, 2017; Bremmer, 2010). Third, the algorithms that power social media platforms inadvertently contribute to the dissemination of disinformation and propaganda by confining users into homogeneous “echo chambers”."
 * (NATO 2021) "Some non-state actors, including terrorist organizations, far right and conspiracy theory movements, and informal groups motivated by monetary gain are developing sophisticated disinformation and propaganda capabilities. These actors harness evolving technologies – on which Allied societies rely – to further enhance their capacity to spread harmful narratives. The latter are disseminated primarily by individual citizens within our borders, either purposefully or unwittingly."
 * (NATO 2021) "Disinformation and propaganda, even when created by malicious exogenous actors, ultimately spread largely through the actions of individual citizens within our borders. In that sense, citizens are both the primary targets and the main promoters of false information."
 * (Jankowicz, N. (2020)) Is social media to blame? “But social media relies on user engagement with content to function, and Americans were quite happy to interact with and share sensationalist, divisive, and unfounded information they saw during the course of the election and beyond. What they didn’t know is that they were sharing content produced and posted from accounts run out of Russia’s Internet Research Agency [troll farm].”
 * (Jankowicz, N. (2020)) “... those drawn to disinformation and most likely to fall for its falsehoods are not searching for a new narrative; they are searching for a renewed, more responsive form of governance, restored trust in the state and the media, and faith in their futures in countries that have left them behind.”
 * (Jankowicz, N. (2020)) “Perhaps the gravest mistake that all of the countries in this book have made, including and especially the United States, is ignoring the / use of homegrown actors and domestic disinformation to amplify preexisting conflict and discord.”
 * (Van der Linden (2020) ) "Different polls show that over a third of people self-report frequent, if not daily exposure, to misinformation80. Of course, the validity of people’s self-reported experiences can be variable, but it raises questions about the accuracy of exposure estimates, which are often based on limited public data and can be sensitive to model assumptions. Moreover, a crucial factor to consider here is that exposure does not equal persuasion (or ‘infection’). For example, research in the context of COVID-19 headlines shows that people’s judgments of headline veracity had little impact on their sharing intentions. People may thus choose to share misinformation for reasons other than accuracy. For example, one recent study found that people often share content that appears ‘interesting if true’."
 * (Van der Linden (2020) ) "More generally, the body of research on ‘spreading’ has faced significant limitations, including critical gaps in knowledge. There is skepticism about the rate at which people exposed to misinformation begin to actually believe it because research on media and persuasion effects has shown that it is difficult to persuade people using traditional advertisements. But existing research has often used contrived laboratory designs that may not sufficiently represent the environment in which people make news-sharing decisions. For example, studies often test one-off exposures to a single message rather than persuasion as a function of repeated exposure to misinformation from diverse social and traditional media sources. Accordingly, we need a better understanding of the frequency and intensity with which exposure to misinformation ultimately leads to persuasion. Most studies also rely on publicly available data that people have shared or clicked on, but people may be exposed and influenced by much more information while scrolling on their social-media feed45. Moreover, fake news is often conceptualized as a list of URLs that were fact-checked as true or false, but this type of fake news represents only a small segment of misinformation; people may be much more likely to encounter content that is misleading or manipulative without being overtly false. Finally, micro-targeting efforts have significantly enhanced the ability for misinformation producers to identify and target subpopulations of individuals who are most susceptible to persuasion83. In short, more research is needed before precise and valid conclusions can be made about either population-level exposure or the probability that exposure to misinformation leads to infection (that is, persuasion)."
 * (Vosoughi, S. et al. (2018)) "Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news than for false news about terrorism, natural disasters, science, urban legends, or financial information. We found that false news was more novel than true news, which suggests that people were more likely to share novel information. Whereas false stories inspired fear, disgust, and surprise in replies, true stories inspired anticipation, sadness, joy, and trust. Contrary to conventional wisdom, robots accelerated the spread of true and false news at the same rate, implying that false news spreads more than the truth because humans, not robots, are more likely to spread it"
 * (Kai Kupferschmidt on Twitter on the Vosoughi publication and follow-ups) "In 2018, Science published a piece by @sinanaral, @dkroy and @CrashTheMod3 that showed that false news spreads “farther, faster, deeper, and more broadly than the truth”. ... The result was often communicated as a very general “lies spread faster than the truth”. But that's a bit of an overstatement. The Science paper specifically looked at news that had been fact-checked by sites like Snopes, so news that had gotten enough attention to warrant that. The authors checked their results on a larger news sample that was not fact-checked and got the same result. But generalising to all fake news is still an inference. ... The conclusions of that paper were important for many reasons. One interesting implication was that it might be possible to pinpoint misinformation automatically, simply by the pattern of its “farther, faster, deeper, broader” spread. So far so good. Then late last year, @jugander and @jonassjuul published a paper in PNAS looking at the same data. Now, I originally wrote in my Science piece that this analysis showed that, when accounting for the size of these news cascades, the effect of fake news spreading faster disappeared. But that is wrong. That was me misreading the paper. What the PNAS paper actually showed was that fake news in the sample really does spread faster, farther, deeper and broader. But the “faster, deeper and broader” part is down to the “farther” part. So what looks like a different pattern is really a difference in one thing: size. That means that this pattern alone is probably not all that helpful in pinpointing fake news, because it is essentially just one characteristic (it could be one feature of several to use in automatic detection of misinformation, of course). But the important thing is that the PNAS paper actually confirms the basic conclusion of the Science paper rather than contradicting it. Apologies to the authors of both papers for getting this wrong." See also Daniel Engber on Twitter.
 * (Lewandowsky, S. et al. (2021) ) "Human culture strongly depends on people passing on information. Although the believability of information has been identified as a factor determining whether it is propagated (Cotter, 2008), people seem to mainly pass on information that will evoke an emotional response in the recipient, irrespective of the information’s truth value. Emotional arousal in general increases people’s willingness to pass on information (Berger, 2011). Thus, stories containing content likely to evoke disgust, fear, or happiness are spread more readily from person to person and more widely through social media than are neutral stories (Cotter, 2008; Heath, Bell, & Sternberg, 2001; K. Peters, Kashima, & Clark, 2009)."
 * (Van Prooijen, J. et al. (2021)) "We conclude that one reason why people believe conspiracy theories is because they find them entertaining."
 * (Popoli, G., Longus, A. (2021)) "Although more research is needed in this area, the present study did support the hypothesis that there are, in fact, significant gender differences within each facet. For our sample, women scored significantly higher than males when it came to government malfeasance, malevolent global conspiracies, extraterrestrial cover-up, personal well-being, and control of information."
 * (Jankowicz, N. (2020)) “As in the case of the US election, it is hard to determine if voters were affected by the widespread disinformation campaigns that occurred during the Czech [2018 Presidential] Election.”
 * (Lyons, B. et al. (2021)) "Our results suggest that overconfidence may be a crucial factor for explaining how false and low-quality information spreads via social media."
 * (Sirlin, N. et al. (nd)) Individuals with high digital literacy are less likely to fall for fake news, but they are no less willing to share it.
 * (Altay, S. (2021)) People may share fake news not by mistake, but because they think it would be ‘interesting-if-true’.
 * (Solovev & Proellochs (2022)) Summary by Jay van Bavel on Twitter: "Moral Emotions Shape the Virality of #COVID19 Misinformation ... An analysis of 10,610 rumor cascades retweeted >24 million times finds that COVID-19 misinformation is more likely to go viral than facts because it contains more other-condemning emotions."
 * Older people more likely to share fake news on Facebook, study finds
 * (PEW Research) "About two-thirds of Americans (64%) say social media have a mostly negative effect on the way things are going in the country today ... When asked to elaborate on the main reason why they think social media have a mostly negative effect on the way things are going in this country today, roughly three-in-ten (28%) respondents who hold that view mention the spreading of misinformation and made-up news. Smaller shares reference examples of hate, harassment, conflict and extremism (16%) as a main reason, and 11% mention a perceived lack of critical thinking skills among many users".
 * (Ecker, U. et al. (2022)) "Misleading content that spreads quickly and widely (‘virally’) on the internet often contains appeals to emotion, which can increase persuasion. ... Moreover, according to a preprint that has not been peer-reviewed, ‘happy thoughts’ are more believable than neutral ones . Emotion can be persuasive because it distracts readers from potentially more diagnostic cues, such as source credibility."
 * (Ecker, U. et al. (2022)) "On social media, sharing is often dictated by what captures attention. Moral-emotional words such as ‘fight’, ‘greed’, ‘evil’ and ‘punish’ are prioritized in early visual attention over other arousing words and also lead to increased sharing."
 * (Ecker, U. et al. (2022)) "‘Lazy’ or intuitive thinking can also lead people to share content that they might recognize as false if they thought about it more."
 * (Ecker, U. et al. (2022)) "... asking people to explain how they know that news headlines are true or false reduces sharing of false political headlines, and brief accuracy nudges — simple interventions that prompt people to consider the accuracy of the information they encounter or share — can reduce sharing of false news about politics207 and COVID-19 ."
 * (Ecker, U. et al. (2022)) "Most people report that they would need to be paid to share false news; even when stories favour their political views, they worry about possible reputation costs from sharing false news. Those reputation costs are real — over half of social media users report that they have stopped following someone who posted ‘made-up news and information’."
 * (Ecker, U. et al. (2022)) "There are some innocuous reasons to intentionally spread falsehoods; for example, it is tempting to share information that would be ‘interesting (or consequential) if true’. Likewise, findings from a preprint that has not been peer-reviewed suggest that people might share positive but questionable claims that could make others feel better ... There are also self-serving motives for sharing, such as to signal group membership or for self-promotion. Finally, some people share misinformation to fuel moral outrage in others."
 * (Ecker, U. et al. (2022)) "With these alternative goals in mind, the viral nature of misinformation does not occur despite its low veracity but because of its ability to fulfil other psychological needs."
 * (European Union) Statistics on individuals who have encountered misinformation
 * (The New York Times (2021)) Case study on the reception of the video 'Loose change'.
 * Disinformation spreaders flock to Substack.
 * (Sternisko, A. et al (2021) ) "... we found a robust, positive relationship between national narcissism and proneness to believe and disseminate conspiracy theories related to COVID-19. Furthermore, belief in COVID-19 conspiracy theories was related to less engagement in health behaviors and less support for public-health policies to combat COVID-19. Our findings illustrate the importance of social identity factors in the spread of conspiracy theories and provide insights into the psychological processes underlying the COVID-19 pandemic."
 * (Hansen-Staszyński (2022) ) Authentic dissemination by individuals is described in terms of responsibility and a choice: "Either we are will-less victims of evil elites without any hope on exerting influence on what is happening around us. Or we are critical citizens who are responsible participants in our society. We can’t have our cake and eat it too: we can’t pose as powerless victims and at the same time try to steer others into a destructive direction. And we can’t pose as responsible citizens and just publish whatever our gut dictates us."
 * (KRRiT (nd)) "The report Dezinformacja w sieci. Analiza wiarygodności kanałów informacyjnych (Disinformation online: an analysis of the credibility of news channels), prepared by IAB Polska and published in July 2018, shows that social media are the main source of false information about national and world events. More than half of internet users, asked in which online environments they most often came across false information, pointed to social media. They were indicated by 58% of respondents; news portals came second (39%)."
 * (Yasha Mounk on Twitter)
 * Perhaps the dissemination of disinformation is also related to grandstanding. For grandstanding, see Tosi & Warmke (2020).
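The "farther, faster, deeper, broader" comparisons discussed above (Vosoughi et al. and the PNAS follow-up) refer to structural properties of reshare cascades. A minimal sketch of computing such metrics from a parent-child edge list follows; the representation and function names are assumptions for illustration, not the papers' actual datasets or code:

```python
from collections import defaultdict

def cascade_metrics(edges):
    """Compute size, depth, and max breadth of a reshare cascade.

    edges: list of (parent, child) pairs; the root is the one node
    that never appears as a child.
    """
    children = defaultdict(list)
    child_set, nodes = set(), set()
    for parent, child in edges:
        children[parent].append(child)
        child_set.add(child)
        nodes.update((parent, child))
    root = next(iter(nodes - child_set))
    # Breadth-first walk, counting nodes at each depth level.
    depth_counts = defaultdict(int)
    frontier, depth = [root], 0
    while frontier:
        depth_counts[depth] = len(frontier)
        frontier = [c for n in frontier for c in children[n]]
        depth += 1
    return {
        "size": len(nodes),                   # "farther": total reach
        "depth": max(depth_counts),           # "deeper": longest reshare chain
        "breadth": max(depth_counts.values()) # "broader": widest generation
    }
```

The PNAS point summarized above is that once cascade size is controlled for, the depth and breadth differences between true and false news largely vanish, so metrics like these are correlated rather than independent signals.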

Recommendations
 * (The Royal Society (2022)) "Governments and social media platforms should not rely on content removal as a solution to online scientific misinformation. ... there is little evidence to support the effectiveness of this approach for scientific misinformation, and approaches to addressing the amplification of misinformation may be more effective. ... In addition, demonstrating a causal link between online misinformation and offline harm is difficult to achieve, and there is a risk that content removal may cause more harm than good by driving misinformation content (and people who may act upon it) towards harder-to-address corners of the internet. ... Furthermore, removing content may exacerbate feelings of distrust and be exploited by others to promote misinformation content. Finally, misinformation sometimes comes from domestic political actors, civil society groups, or individual citizens who may, in good faith, believe in the content they are spreading, even if it may be harmful to others. It is clear that they may well regard direct action against their expression as outright censorship."
 * (Belgian Senate (2021)) "The Senate recommends encouraging social media platforms to limit the reach of messages that fact-checking has identified as disinformation, and to indicate clearly that the message in question is disinformation."

Dissemination projects