Jankowicz, N. (2020). How to Lose the Information War: Russia, Fake News, and the Future of Conflict. Bloomsbury Publishing.

(xii) “Despite efforts by the World Health Organization, the United Nations and some proactive national governments, the coronavirus pandemic soon became an “infodemic” as well.”

(xiv) “The algorithmic recommendations on which social media platforms thrive ... had always been vectors for indoctrination and extremism.”

(xvii) “... well before the fertile climate of the 2020’s disinformation took hold, it was our own internal fissures and our inaction that allowed false narratives to begin to spread.”

(xxv) “Unlike Soviet propaganda, which sought to promote a specific, communist-centric worldview, the Kremlin divides and deceives populations around the world with one goal in mind: the destruction of Western democracy as we know it. Russian deceptions exploit fissures in targeted societies to sow doubt, distrust, discontent and to further divide populations and their governments. The ultimate goal is to undermine democracy ... and drive citizens to disengage.” When democracies have “trouble functioning” Putin gets a seat at the negotiating table. And Putin gains a comparative advantage. He can state to his domestic critics that democracies are also having troubles.

(xxvii) “... we need to clearly define and categorize these phenomena if we are to successfully understand and counter them.”

(xxvii) “All of the tactics Russia employs to angle for international notoriety can be categorized as “influence operations.””

(xxvii) “the category of disinformation – “when false information is knowingly shared to cause harm” – or malinformation – “when genuine information is shared to cause harm, often by moving information designed to stay private into the public sphere.””

(xxviii) “In fact, fake news encompasses just a sliver of Russian influence operations. The most convincing Russian narratives, and indeed, the most successful, in both Central and Eastern Europe and in the United States, are narratives grounded in truth that exploit the division in societies. These truths can be undisputed facts or the perceived realities of life for marginalized populations that Russian operations target.”

(xxxv) “Keeping people at the heart of Western policy on the Kremlin’s influence campaigns is critical not only in responding to Russia’s online offensives, but in repairing the cracks in our democracies that allowed them to begin in the first place.”

(3) “What makes the information war so difficult to win is not just the online tools that amplify and target its messages or the adversary that is sending them; it’s the fact that those messages are often unwittingly delivered not by trolls or bots, but by authentic local voices.”

(4) Is social media to blame? “But social media relies on user engagement with content to function, and Americans were quite happy to interact with and share sensationalist, divisive, and unfounded information they saw during the course of the election and beyond. What they didn’t know is that they were sharing content produced and posted from accounts run out of Russia’s Internet Research Agency [troll farm].”

(6) “Through this generally apolitical content, the Internet Research Agency built users’ trust in the pages, increased engagement, and grew their followings to hundreds of thousands of people each.”

(6) “Whether on Facebook, Twitter, Instagram, or beyond, these communities were built on emotion, not through the purchase of online advertising.”

(7) “As trust within their inauthentic communities grew, the IRA operatives’ asks of community members did as well. They began by initiating armchair activist campaigns based on easy tasks that furthered the sense of community in the group.”

(7) “The IRA also purchased 3,500 advertisements on Facebook to boost engagement and reach on their posts.”

(9) The organic Facebook posts in the dataset “were shared by users just under 31 million times, liked almost 39 million times, reacted to with emojis almost 5.4 billion times, and ... generat[ed] almost 3.5 million comments.” [Oxford Internet Institute research on a dataset the Senate Select Committee on Intelligence provided them]

(16) Ryan Clayton on supporting extremes. “... if you can weight the side, you can really pull at the fabric of society. You can pull it apart.”

(18) “... unless we recognize and address the areas that make our societies vulnerable to Russian – and other foreign – manipulation in the first place, we will never be able to address the problem.”

(19) “Rather than approaching the problem holistically, considering how to both address root causes and mitigate immediate effects, we do what’s easy: we play a never-ending game of What-a-Troll, deleting offending accounts that keep cropping up like mechanical creatures in a carnival game across social media platforms. We blame technology companies for their inaction. We blame lawmakers for their technological ineptitude.”

(50) “One thing that’s clear from Estonia’s experience is that simply making policy without engaging [disenfranchised] communities, or lecturing them that the authentic feelings Russia has exploited to manipulate them are somehow incorrect, won’t generate trust and won’t build a new identity in which all Americans can take pride.”

(76) Georgia. “It may sound ridiculous but this “gateway tactic” is one of the most important in the Russian disinformation arsenal and the reason RT can claim to be the most-watched news channel on the Internet. Rather than posting videos only about their reporting, the RT YouTube channel is filled with so-called disaster porn – videos of mass destruction such as the 2011 Tsunami in Japan – and instantly viral videos of cute animals. This allows the channel to build viewership and trust so that when the RT logo is emblazoned on a dubious news story or editorial take, viewers are more likely to give it a chance.”

(78) Batu Kutelia. “Even today ... you hear some people say that culture is not politics. Everything is politics. Everything has been weaponized. That’s the Russian strategy. That’s part of hybrid warfare.”

(79) Nina Nakashidze. “We’ve come to the conclusion that ‘infotainment’ works much better than just simple talk shows where people are preaching and teaching.”

(90) “The Poland I visit today is fractured, partisan, and petty. It is gripped by conspiracy theories or else obsessed with disproving them.”

(90) “... the Smolensk disaster, an unspeakable tragedy that should have inspired cohesion and unity, has instead been used to drive division and a pandemic of partisanship in the country.”

(93) MFA official. “Russia is retaining the wreckage to heat the internal debate in Poland.”

(94) “... Kaczyński’s embrace of a conspiracy theory he lifted from the “defenders of the cross.””

(97) Wojciech Kość. “... “conspiracies were marginal” until / the crash was politicized.”

(98) Łukasz Warzecha. “... he recognizes the polarization of Polish television is an obstacle to fighting disinformation in the country. But the biggest problem, in Warzecha’s opinion, is people. “How many people consciously look for unbiased news?” he asks. “If they watched [the public broadcaster] during the PO days,” when the current opposition party was in power, “they still do.” Habits reign supreme.”

(99) “... Poland’s approach to the problem of disinformation: polarized, flippant, lacking substance, and nearly / devoid of forward-looking responses.”

(100) “In order to circumvent Poles’ disinclination toward overly Russian sources, disinformation in Poland is exercised through calculated decisions, like the retention of the Smolensk wreckage, fueling partisan rancor in Poland. It also seizes upon the most delicate, destructive, and difficult elements of Polish discourse. And in large part, Polish actors drive it.”

(102) InfoOps. “... while these narratives begin on blatantly Russian websites such as Sputnik, they are amplified by the homegrown Polish media ecosystem. Stories are republished on online forums and by influential bloggers, making the content seem more trustworthy as it is laundered through Polish-language outlets and increasing the sheer number of sources “reporting” a story that initially appeared on a Russian government-sponsored website. The report makes no assertion whether this laundering of information and influence is something Russia deliberately orchestrates or if it happens naturally; either way, the effect is the same. As in Georgia, local voices are spreading divisive narratives and hastening the polarization of Polish society. It’s a phenomenon that plagues the United States as well ...”

(103) Kamil Basaj. “But at its root, Basaj says what Poland needs to address is “the polarization of society. This is ... one of the results of not only / propaganda or disinformation campaigns but the whole system of influence to the information sphere.” Whether they are created in Russia or within the borders of Poland, influence operations have increased polarization in Poland and left the country more vulnerable to disinformation in the future.”

(104/105) PiS polarized on LGBT. Some of the anti-LGBT disinformation “had connections to Russia”. Unclear whether PiS is aware of that.

(107) Grzegorz Rzeczkowski. “... Russia wants to “show foreigners that Poland is crazy,” ...”

(108) Waitergate. According to Rzeczkowski, it has links with Russia. “It’s these connections, along with PiS’s implementation of the very tactics that the Kremlin employs, that make a coherent response to Russian disinformation and influence operations impossible, Rzeczkowski believes.”

(112) “Given that Russia uses domestic actors and homegrown messages to sow division, the problem facing Polish civil servants ... becomes more and more intractable.”

(117) Marek Zagórski. “... it’s quite easy to influence people’s views, because ... we have a situation in which a majority of social media and internet users are not able to see a difference between an opinion and just news or informative content.”

(120) “... education is undoubtedly vital, but pushing for greater media freedom and a pledge to commit to democratic communication tactics – not those that mimic the Russian playbook, parroting conspiracy theories and exploiting societal divisions – would go a long way toward decreasing polarization and increasing trust in government and media throughout the country, making it more resilient to foreign interference.”

(120) PiS. “... by weaponizing conspiracy theories and choking the editorial independence of the public broadcaster, it has deepened societal divisions and thus given Russia the upper hand.”

(126) “The Internet Research Agency, the infamous St. Petersburg troll factory that would change the discourse in the 2016 US election using inauthentic accounts, misleading means, and the mobilization of discontent, tested its tactics in Ukraine before using them on Americans.” [David Patrikarakos, War in 140 Characters. Basic Books, 2017]

(129) Ukraine referendum in NL. “The Kremlin, ever seeking to exploit societal fissures, especially those that undermine the West, seems to have helped the campaign along, just as it would in the United States later that year.”

(132) “Russian-backed media in the Netherlands, including the Dutch version of Kremlin-funded propaganda networks RT and Sputnik, began pushing out anti-Ukraine narratives to provide fodder for the campaign. The GeinStijl [sic] blog, which led the campaign, greedily lapped them up.”

(138) “... every time [the Ukrainian campaign team] “publish[ed] a good piece on the Ukraine, it [was] followed with [a piece] on corruption or killing stray dogs.”” Suspicion that Russia directed this.

(140) Threatening video released. “Several days before the April 6 referendum, the open-source investigation team at Bellingcat ... exposed that the video was likely created and promoted by the infamous St. Petersburg “troll factory,” the IRA.”

(141) “... a New York Times investigation in early 2017 found that “the most active members of [Van Bommel’s] Ukrainian team were actually from Russia, or from breakaway Russian-speaking regions of Ukraine, and parroted the Kremlin line.””

(144) “... the Dutch security service placed special emphasis on Russia’s malign activities in its 2017 annual report.”

(144) “[Dmytro] Kuleba sees the lack of public attribution to the campaign as part and parcel of the Russian influence playbook. “The most difficult thing was to catch Russia in hand.””

(146) “This is how disinformation – whether Russian or domestic, in the Netherlands or elsewhere - functions. It preys on real misgivings, fears, and societal fissures, and heightens emotion, ensuring that reason is overwhelmed.”

(147) “... the perfect encapsulation of how Russian disinformation works: take something that people are already mad about, pollute the information ecosystem, and get them so frustrated they start to distrust institutions and disengage.”

(150) “A press release, no matter how well written, cannot fully correct a salacious story. A fact-check, even if verified beyond a shadow of a doubt, will not convince a conspiracy theorist to give up his fervent speculations.”

(151) “... a Yale University study found that labeling content “disputed” on the platform [Facebook] had little effect on user behaviour. The labels helped only 3.6 percent of those surveyed identify false stories. More worryingly, among supporters of President Trump and those under twenty-six, the study found that the flags did the opposite of their intended goal and made users more likely to believe the content. Finally, the study warned about a truly frightening problem of scale. In what it dubs the “Implied Truth Effect,” the study found that users assumed unlabeled content was accurate, a nearly unsurmountable obstacle in the endless flow of information that is today’s internet.” [Gordon Pennycook et al., The implied truth effect. Management science, January 16, 2020.]

(161) “What gave birth to Czech Republic’s anti-Russian disinformation strategy was, quite simply, Islamophobia.”

(170) Benedikt Vangeli, director of the Center for Terrorism and Hybrid Threats. Origin story. “Disinformation and propaganda have radicalized half of the population.”

(183) “As in the case of the US election, it is hard to determine if voters were affected by the widespread disinformation campaigns that occurred during the Czech [2018 Presidential] Election.”

(186) “... openness and dialogue ... are the key to repairing root divisions that cause disinformation ...”

(191) “This ... is the ideal outcome for Russia. American democracy – once a shining city on a hill – is weak and crumbling. The debate, dissent, and protest, on which the United States was founded are increasingly foreign concepts. Corruption, once kept in check by an active media and engaged electorate, reaches to the highest levels of government. Consumed by problems at home, the United States is less engaged abroad. And the Kremlin points at the failings of our democratic system to justify repressions and a broader embrace of authoritarianism inside and outside its borders.”

(192) “building a feasible, generational, citizens-based response to the problem”

(192ff) Measures aimed at social media platforms. (195) “They are not enough. ... The government itself ought to be pushing for more action ...”

(197) “The sparse efforts the United States has initiated are disparate and uncoordinated. They are, for the most part, focused on retaliation and short-term prevention and awareness-building, rather than systemic, generational solutions.”

(198) “... it isn’t only foreign ills that plague us. Unless we mitigate our own political polarization, our own internal issues, we will continue to be an easy target for any malign actor ... to manipulate.”

(200) “Although Western countries ... share many fewer cultural ties with Russia than Georgia does, ... less obvious vectors of influence still wield power in our political system.” Lobbying groups. (200/1) Money.

(202) “... those drawn to disinformation and most likely to fall for its falsehoods are not searching for a new narrative; they are searching for a renewed, more responsive form of governance, restored trust in the state and the media, and faith in their futures in countries that have left them behind.”

(202) “We are not even sure that facts can prevail over disinformation, let alone government-concocted narratives of peace and prosperity.”

(202) “... an ever-declining environment of distrust toward institutions, including government and the media, two critical players in the counter-disinformation space.”

(203) “In Poland, where the Law and Justice government utilized conspiracy theories and fearmongering to gain support in the short term, polarization has skyrocketed in the long term.”

(206) “Perhaps the gravest mistake that all of the countries in this book have made, including and especially the United States, is ignoring the / use of homegrown actors and domestic disinformation to amplify preexisting conflict and discord.”

(209) “We must first begin with addressing social media and its effect on democratic discourse and our societies overall.”

(213) USA. “... the connective tissue between the local and the national or international has been lost.”

(213) “The American information landscape is further complicated by a number of factors, including declining trust in the media.”

(215) “Even with greater investments in quality journalism, people will still need help in navigating the unending flow of information that characterizes the modern media environment ...”

(217) “... studies show education has an impact. Eighteen months after the IREX program ...” [still impact] [IREX, Impact study on citizens’ ability to detect disinformation 1.5 years after completing a news media literacy program. 2018]

(218) Media literacy. “The Finnish government equips even its youngest citizens with tools to survive in today’s crowded media environment.”

(220) “Those who understand how government operates in actuality are much less likely to buy into conspiracy theories about its secret cabals.”

(220) “How do we ensure the effects of citizen-based counter-disinformation programs are not only felt when today’s students reach adulthood? We should begin with state employees.”

(221) “... public libraries can also serve as counter-disinformation educational hubs for people of all ages.”

(222) “... education and awareness alone cannot win the information war. It will take a long-term investment in them to even win a single battle. But in countries where disinformation – emanating both from Russia and domestic sources – has long been a reality of life, empowering people to be active and engaged members of society through investments in the information space and in people themselves is always part of the solution.”

(223) Dream. 2028. “Party leaders, editors-in-chief of the country’s newspapers, major broadcasters, and even the social media platforms convened to sign a Declaration of Truth and Integrity: they would not discuss, cover, or amplify content for which a legitimate source could not be established,”

(228) US Congress “... they choose ... to shout lies through a megaphone, capitalizing on their constituents’ unfamiliarity, ambivalence, or polarization.”

(228) “In large part, the crisis of truth and trust the United States finds itself in today is one of our own making. Until our elected officials begin to once again respect the truth, it is up to us – at protests, in the voting booth – to remind them it exists.”