Social Psychology

Internal code: 5.1 Target groups: recipients

General description
 * (Lewandowsky, S. et al. (2021) ) "It is a truism that a functioning democracy relies on an educated and well-informed populace (Kuklinski, Quirk, Jerit, Schwieder, & Rich, 2000). The processes by which people form their opinions and beliefs are therefore of obvious public interest, particularly if major streams of beliefs persist that are in opposition to established facts. If a majority believes in something that is factually incorrect, the misinformation may form the basis for political and societal decisions that run counter to a society’s best interest; if individuals are misinformed, they may likewise make decisions for themselves and their families that are not in their best interest and can have serious consequences."
 * (Sandel, M. (2021)) The idea that people have different opinions because they are misinformed is technocratic.

Interventions

Impact measurements

Assumptions: Origins, Censoring, Mechanisms
 * Social psychological roots of post-truth: cognitive bias (Leon Festinger: A Theory of Cognitive Dissonance, 1957); peer pressure (Solomon Asch: Opinions and Social Pressure, 1955); confirmation bias (Peter Cathcart Wason: On the Failure to Eliminate Hypotheses in a Conceptual Task, 1960); the backfire effect (Brendan Nyhan & Jason Reifler). - McIntyre, L. (2018)
 * (Lewandowsky, S. et al. (2021) ) "Reliance on misinformation differs from ignorance, which we define as the absence of relevant knowledge. Ignorance, too, can have obvious detrimental effects on decision making, but, perhaps surprisingly, those effects may be less severe than those arising from reliance on misinformation. Ignorance may be a lesser evil because in the self-acknowledged absence of knowledge, people often turn to simple heuristics when making decisions. Those heuristics, in turn, can work surprisingly well, at least under favorable conditions. For example, mere familiarity with an object often permits people to make accurate guesses about it (Goldstein & Gigerenzer, 2002; Newell & Fernandez, 2006). Moreover, people typically have relatively low levels of confidence in decisions made solely on the basis of such heuristics (De Neys, Cromheeke, & Osman, 2011; Glöckner & Bröder, 2011). In other words, ignorance rarely leads to strong support for a cause, in contrast to false beliefs based on misinformation, which are often held strongly and with (perhaps infectious) conviction."
 * For moral reasoning & moral judgments - see Haidt, J. (2013)
 * (Lewandowsky, S. et al. (2021) ) "Rumors and urban myths constitute important sources of misinformation."
 * (Lewandowsky, S. et al. (2021) ) "A related but perhaps more surprising source of misinformation is literary fiction. People extract knowledge even from sources that are explicitly identified as fictional. This process is often adaptive, because fiction frequently contains valid information about the world."
 * (Lewandowsky, S. et al. (2021)) Other sources of disinformation: governments, politicians, and NGOs.
 * (Lewandowsky, S. et al. (2021) ) "... the media sometimes unavoidably report incorrect information because of the need for timely news coverage. There are, however, several other systemic reasons for why the media might get things wrong. First, the media can inadvertently oversimplify, misrepresent, or overdramatize scientific results. Science is complex, and for the layperson, the details of many scientific studies are difficult to understand or of marginal interest. Science communication therefore requires simplification in order to be effective. Any oversimplification, however, can lead to misunderstanding. ... Second, in all areas of reporting, journalists often aim to present a “balanced” story. In many instances, it is indeed appropriate to listen to both sides of a story; however, if media stick to journalistic principles of “balance” even when it is not warranted, the outcome can be highly misleading (Clarke, 2008)."
 * (Lewandowsky, S. et al. (2021) ) "The Internet has revolutionized the availability of information; however, it has also facilitated the spread of misinformation because it obviates the use of conventional “gate-keeping” mechanisms, such as professional editors. This is particularly the case with the development of Web 2.0, whereby Internet users have moved from being passive consumers of information to actively creating content on Web sites such as Twitter and YouTube or blogs. People who use new media, such as blogs (McCracken, 2011), to source their news report that they find them fairer, more credible, and more in-depth than traditional sources (T. J. Johnson & Kaye, 2004). ... On the other hand, information on the Internet can be highly misleading, and it is progressively replacing expert advice."
 * (Ashokkumar, A. et al. (2020)) People selectively censor online content that challenges their political beliefs, deleting 5-12% more content from out-groups. When beliefs are rooted in identity, selective censoring is amplified, even when the content is inoffensive. (summary by Jay van Bavel on Twitter)
 * (Ashokkumar, A. et al. (2020)) As ordinary citizens increasingly moderate online forums, blogs, and their own social media feeds, a new type of censoring has emerged wherein people selectively remove opposing political viewpoints from online contexts. In three studies of behavior on putative online forums, supporters of a political cause (e.g., abortion or gun rights) preferentially censored comments that opposed their cause. The tendency to selectively censor cause-incongruent online content was amplified among people whose cause-related beliefs were deeply rooted in or “fused with” their identities. Moreover, six additional identity-related measures also amplified the selective censoring effect. Finally, selective censoring emerged even when opposing comments were inoffensive and courteous. We suggest that because online censorship enacted by moderators can skew online content consumed by millions of users, it can systematically disrupt democratic dialogue and subvert social harmony. (Keywords: censorship; selective censoring; identity politics; moderators; identity fusion; social media)
 * (Lewandowsky, S. et al. (2021) ) "The growth of cable TV, talk radio, and the Internet have made it easier for people to find news sources that support their existing views, a phenomenon known as selective exposure (Prior, 2003). When people have more media options to choose from, they are more biased toward like-minded media sources. The emergence of the Internet in particular has led to a fractionation of the information landscape into “echo chambers”—that is, (political) blogs that primarily link to other blogs of similar persuasion and not to those with opposing viewpoints. More than half of blog readers seek out blogs that support their views, whereas only 22% seek out blogs espousing opposing views, a phenomenon that has led to the creation of “cyber-ghettos” (T. J. Johnson, Bichard, & Zhang, 2009). These cyber-ghettos have been identified as one reason for the increasing polarization of political discourse (McCright, 2011; Stroud, 2010)."
 * (Lewandowsky, S. et al. (2021) ) "Misleading information rarely comes with a warning label. People usually cannot recognize that a piece of information is incorrect until they receive a correction or retraction. For better or worse, the acceptance of information as true is favored by tacit norms of everyday conversational conduct: Information relayed in conversation comes with a “guarantee of relevance” (Sperber & Wilson, 1986), and listeners proceed on the assumption that speakers try to be truthful, relevant, and clear, unless evidence to the contrary calls this default into question (Grice, 1975; Schwarz, 1994, 1996). Some research has even suggested that to comprehend a statement, people must at least temporarily accept it as true (Gilbert, 1991). On this view, belief is an inevitable consequence of—or, indeed, precursor to—comprehension. Although suspension of belief is possible (Hasson, Simmons, & Todorov, 2005; Schul, Mayo, & Burnstein, 2008), it seems to require a high degree of attention, considerable implausibility of the message, or high levels of distrust at the time the message is received. So, in most situations, the deck is stacked in favor of accepting information rather than rejecting it, provided there are no salient markers that call the speaker’s intention of cooperative conversation into question. Going beyond this default of acceptance requires additional motivation and cognitive resources: If the topic is not very important to you, or you have other things on your mind, misinformation will likely slip in. When people do thoughtfully evaluate the truth value of information, they are likely to attend to a limited set of features. First, is this information compatible with other things I believe to be true? Second, is this information internally coherent?—do the pieces form a plausible story? Third, does it come from a credible source? Fourth, do other people believe it? 
These questions can be answered on the basis of declarative or experiential information—that is, by drawing on one’s knowledge or by relying on feelings of familiarity and fluency (Schwarz, 2004; Schwarz, Sanna, Skurnik, & Yoon, 2007)."
 * (Lewandowsky, S. et al. (2021) ) "As numerous studies in the literature on social judgment and persuasion have shown, information is more likely to be accepted by people when it is consistent with other things they assume to be true (for reviews, see McGuire, 1972; Wyer, 1974). People assess the logical compatibility of the information with other facts and beliefs. Once a new piece of knowledge-consistent information has been accepted, it is highly resistant to change, and the more so the larger the compatible knowledge base is. From a judgment perspective, this resistance derives from the large amount of supporting evidence (Wyer, 1974); from a cognitive-consistency perspective (Festinger, 1957), it derives from the numerous downstream inconsistencies that would arise from rejecting the prior information as false. Accordingly, compatibility with other knowledge increases the likelihood that misleading information will be accepted, and decreases the likelihood that it will be successfully corrected."
 * (Lewandowsky, S. et al. (2021) ) "Whether a given piece of information will be accepted as true also depends on how well it fits a broader story that lends sense and coherence to its individual elements. People are particularly likely to use an assessment strategy based on this principle when the meaning of one piece of information cannot be assessed in isolation because it depends on other, related pieces; use of this strategy has been observed in basic research on mental models (for a review, see Johnson-Laird, 2012), as well as extensive analyses of juries’ decision making (Pennington & Hastie, 1992, 1993)."
 * (Lewandowsky, S. et al. (2021) ) "When people lack the motivation, opportunity, or expertise to process a message in sufficient detail, they can resort to an assessment of the communicator’s credibility. Not surprisingly, the persuasiveness of a message increases with the communicator’s perceived credibility and expertise (for reviews, see Eagly & Chaiken, 1993; Petty & Cacioppo, 1986). However, even untrustworthy sources are often influential. Several factors contribute to this observation. People are often insensitive to contextual cues that bear on the credibility of a source. For example, expert testimony has been found to be similarly persuasive whether it is provided under oath or in another context (Nyhan, 2011). Similarly, Cho, Martens, Kim, and Rodrigue (2011) found that messages denying climate change were similarly influential whether recipients were told they came from a study “funded by Exxon” or from a study “funded from donations by people like you.” Such findings suggest that situational indicators of credibility may often go unnoticed, consistent with people’s tendency to focus on features of the actor rather than the situation (Ross, 1977). In addition, the gist of a message is often more memorable than its source, and an engaging story from an untrustworthy source may be remembered and accepted long after the source has been forgotten (for a review of such “sleeper effects,” see Eagly & Chaiken, 1993)."
 * (Lewandowsky, S. et al. (2021) ) "Repeated exposure to a statement is known to increase its acceptance as true (e.g., Begg, Anas, & Farinacci, 1992; Hasher, Goldstein, & Toppino, 1977). In a classic study of rumor transmission, Allport and Lepkin (1945) observed that the strongest predictor of belief in wartime rumors was simple repetition. Repetition effects may create a perceived social consensus even when no consensus exists. Festinger (1954) referred to social consensus as a “secondary reality test”: If many people believe a piece of information, there’s probably something to it. Because people are more frequently exposed to widely shared beliefs than to highly idiosyncratic ones, the familiarity of a belief is often a valid indicator of social consensus. But, unfortunately, information can seem familiar for the wrong reason, leading to erroneous perceptions of high consensus. For example, Weaver, Garcia, Schwarz, and Miller (2007) exposed participants to multiple iterations of the same statement, provided by the same communicator. When later asked to estimate how widely the conveyed belief is shared, participants estimated consensus to be greater the more often they had read the identical statement from the same, single source. In a very real sense, a single repetitive voice can sound like a chorus."
 * (Ecker, U. et al. (2022)) "The formation of false beliefs all but requires exposure to false information. However, lack of access to high-quality information is not necessarily the primary precursor to false-belief formation; a range of cognitive, social and affective factors influence the formation of false beliefs. False beliefs generally arise through the same mechanisms that establish accurate beliefs. When deciding what is true, people are often biased to believe in the validity of information, and ‘go with their gut’ and intuitions instead of deliberating."
 * (Ecker, U. et al. (2022)) "... simply repeating a claim makes it more believable than presenting it only once. This illusory truth effect arises because people use peripheral cues such as familiarity (a signal that a message has been encountered before), processing fluency (a signal that a message is either encoded or retrieved effortlessly) and cohesion (a signal that the elements of a message have references in memory that are internally consistent) as signals for truth, and the strength of these cues increases with repetition. Thus, repetition increases belief in both misinformation and facts. Illusory truth can persist months after first exposure, regardless of cognitive ability and despite contradictory advice from an accurate source or accurate prior knowledge."
 * (Ecker, U. et al. (2022)) "Some of the main cognitive [intuitive thinking - lack of analytical thinking and/ or deliberation; cognitive failures - neglect source cues and/ or knowledge, forget source and/ or counter-evidence; illusory truth - familiarity, fluency, cohesion] and socio-affective [source cues - elite, in-group, attractive; emotion - emotive information, emotional state; worldview - personal views, partisanship] factors that can facilitate the formation of false beliefs when individuals are exposed to misinformation. Not all factors will always be relevant, but multiple factors often contribute to false beliefs."
 * (Ecker, U. et al. (2022)) "In general, messages are more persuasive and seem more true when they come from sources perceived to be credible rather than non-credible. People trust human information sources more if they perceive the source as attractive, powerful and similar to themselves. These source judgements are naturally imperfect — people believe in-group members more than out-group members, tend to weigh opinions equally regardless of the competence of those expressing them and overestimate how much their beliefs overlap with other people’s, which can lead to the perception of a false consensus. Experts and political elites are trusted by many and have the power to shape public perceptions; therefore, it can be especially damaging when leaders make false claims."
 * (Ecker, U. et al. (2022)) "... when misinformation downplays a risk or threat (for example, misinformation that a serious disease is relatively harmless), corrections that provide a more accurate risk evaluation operate partly through their impact on emotions such as hope, anger and fear. This emotional mechanism might help correction recipients realign their understanding of the situation with reality ... Likewise, countering disinformation that seeks to fuel fear or anger can benefit from a downward adjustment of emotional arousal".
 * (Orticio, E. et al. (2021)) "We rely heavily on information from the social world to inform our real-world beliefs. ... we show that increases in people’s estimates of the prevalence of a belief led to increases in their endorsement of said belief. Prevalence information elicited the strongest belief change when people were most uncertain of their initial belief, suggesting that people weigh social information rationally according to the strength of their initial evidence."
 * (CNN) Alex Jones blames his belief that "basically ... everything was staged" on "almost ... like a form of psychosis".
 * (Van der Linden (2020)) "Although people use many cognitive heuristics to make judgments about the veracity of a claim (for example, perceived source credibility), one particularly prominent finding that helps explain why people are susceptible to misinformation is known as the ‘illusory truth’ effect: repeated claims are more likely to be judged as true than non-repeated (or novel) claims. Given the fact that many falsehoods are often repeated by the popular media, politicians, and social-media influencers, the relevance of illusory truth has increased substantially."
 * (Van der Linden (2020)) "Although illusory truth can affect everyone, research has noted that some people are still more susceptible to misinformation than others. For example, some common findings include the observation that older individuals are more susceptible to fake news, potentially owing to factors such as cognitive decline and greater digital illiteracy, although there are exceptions: in the context of COVID-19, older individuals appear less likely to endorse misinformation. Those with a more extreme and right-wing political orientation have also consistently been shown to be more susceptible to misinformation, even when the misinformation in question is non-political. Yet, the link between ideology and misinformation susceptibility is not always consistent across different cultures. Other factors such as greater numeracy skills and cognitive and analytic thinking styles have consistently been revealed to have a negative correlation with misinformation susceptibility—although other scholars have identified partisanship as a potential moderating factor. In fact, these individual differences have given rise to two competing overarching theoretical explanations for why people are susceptible to misinformation. The first theory is often referred to as the classical ‘inattention’ account; the second is often dubbed the ‘identity-protective’ or ‘motivated cognition’ account."
 * (Van der Linden (2020) ) "The inattention or ‘classical reasoning’ account argues that people are committed to sharing accurate content but the context of social media simply distracts people from making news-sharing decisions that are based on a preference for accuracy."
 * (Van der Linden (2020) ) "Motivated reasoning occurs when someone starts out their reasoning process with a pre-determined goal (for example, someone might want to believe that vaccines are unsafe because that belief is shared by their family members), so individuals interpret new (mis)information in service of reaching that goal. The motivated account therefore argues that the types of commitments that people have to their affinity groups is what leads them to selectively endorse media content that reinforces deeply held political, religious, or social identities. There are several variants of the politically motivated reasoning account, but the basic premise is that people pay attention to not just the accuracy of a piece of news content, but also the goals that such information may serve."
 * (Jay van Bavel on the upcoming Fox News effect paper on Twitter) "partisan media’s viewers are not fully aware of its bias, and that it changes even strong partisans’ basic factual beliefs, issue preferences, and evaluations of elected officials."
 * (Imhoff, R. et al. (2022)) "We conclude that conspiracy mentality is associated with extreme left- and especially extreme right-wing beliefs, and that this non-linear relation may be strengthened by, but is not reducible to, deprivation of political control."
 * (Anni Sternisko on YouTube) Psychological Pathways to Conspiracy Theories
 * (Rolf Degen on Costello, T. & Bowes, S. (2022) on Twitter) "Individuals who identify themselves as “extremely left-wing” or “extremely right-wing” are roughly five times more likely than all others to be 100% certain that their political beliefs are correct."
 * (Rolf Degen on Williams, M. et al. (2022) on Twitter) "The stability of beliefs in conspiracy theories is comparable to or higher than some of the most stable psychological attributes."
 * (Lewandowsky, S., Cook, J. (2020)) "Why are conspiracy theories popular?"
 * -"Feeling of powerlessness: People who feel powerless or vulnerable are more likely to endorse and spread conspiracy theories."
 * -"Explaining unlikely events: ... people tend to propose conspiratorial explanations for events that are highly unlikely. Conspiracy theories act as a coping mechanism to help people handle uncertainty."
 * -"Coping with threats: ... A conspiracy theory satisfies the need for a "big" event to have a big cause..."
 * -"Disputing mainstream politics: Conspiracy theories are used to dispute mainstream political interpretations."


 * An alternative/ additional view can be found under Basic Needs Assumptions. Both views reject the Fact-Checking/ Report Disinformation/ Quality Label assumption that cognitive grounds alone are sufficient for recipients to embrace misinformation/ disinformation. Dissemination is a different step.


 * Individual cognition is tied to groups - Sloman, S. & Fernbach, P. (2017). Therefore, the most effective way to combat misinformation seems to be group-based - Higgins, E. (2022)
 * Social identities - Van Bavel, J. & Packer, D. (2021)
 * Unfulfilled Basic Needs - Alexander, B. (2008), Van der Kolk, B. (2014)
 * Beyond Learned Helplessness:
 * - Virtuous victim signaling as a form of Disinfonomics - Ok, E. et al. (2021)
 * - Change in conspiratorial targets from veiled government and elite figures to everyday people

Recommendations

Social Psychology Projects