Three kinds of propaganda, and what to do about them
Jonathan Stray summarizes three different strains of propaganda, analyzes why each works, and suggests counter-tactics: in Russia, it’s about flooding the channel with a mix of lies and truth, crowding out other stories; in China, it’s about suffocating arguments with happy-talk distractions; and for trolls like Milo Yiannopoulos, it’s about weaponizing hate, outraging people so they spread your message to the small, diffuse minority of broken people who welcome it and would otherwise be uneconomical to reach.
Stray cites some of the same sources I’ve written about here: Tucker Max’s analysis of Yiannopoulos’s weaponized hate, and the Harvard Institute for Quantitative Social Science team’s first-of-its-kind analysis of leaked messages directing the activities of the “50-cent army,” which overwhelms online Chinese conversation with upbeat cheerleading (think of Animal Farm’s sheep-bleating, or Nineteen Eighty-Four’s duckspeak).
But I’d never encountered the work he references on Russian propaganda, by RAND scholar Christopher Paul, who calls Russian disinformation a “firehose of falsehood.” The tactic relies on having huge numbers of channels at your disposal (fake and real social media accounts, tactical leaks to journalists, state media outlets like RT), which convey the narrative at a higher volume than any counternarrative; the story becomes compelling just by dint of being everywhere (“quantity does indeed have a quality all its own”).
Mixing outright lies with a large dollop of truth is key to this approach, because it surrounds the lies with a penumbra of truthfulness. This is a time-honored tactic, of course: think of the Christian Science Monitor’s history of outstanding international coverage, accompanied by editorials about God’s ability to heal through prayer; or Voice of America’s mixture of excellent reporting on (again) international politics and glaring silence on US crises (see also: Al Jazeera as a reliable source on everything except corruption in Qatar; the BBC World Service’s top-notch journalism on everything except UK complicity in disasters like the Gulf War, etc.).
In addition to this excellent taxonomy of propaganda, Stray proposes countermeasures for each strain: for Russia-style “firehoses of falsehood,” you have to reach the audience first with an alternative narrative; once the firehose is on, it’s too late. For Chinese duckspeak floods, you need “organized, visible resistance” in the streets. For pathetic attention-whores like Yiannopoulos, Stray says Tucker Max is right: you have to ignore him.
As I’ve written before, we’re not living through a crisis about what is true; we’re living through a crisis about how we know whether something is true. We’re not disagreeing about facts; we’re disagreeing about epistemology. The “establishment” version of epistemology is, “We use evidence to arrive at the truth, vetted by independent verification (but trust us when we tell you that it’s all been independently verified by people who were properly skeptical and not the bosom buddies of the people they were supposed to be fact-checking).”
The “alternative facts” epistemological method goes like this: “The ‘independent’ experts who were supposed to be verifying the ‘evidence-based’ truth were actually in bed with the people they were supposed to be fact-checking. In the end, it’s all a matter of faith, then: you either have faith that ‘their’ experts are being truthful, or you have faith that we are. Ask your gut, what version feels more truthful?”
One of the most insightful things I’ve heard about the epistemological crisis came from a recent episode of the hilarious News Quiz on BBC Radio 4: the people who support Trump do so tribally, like supporters of a sports team. If you hear that your team had three players thrown out of the game for breaking the rules and still won, you don’t rail against their cheating; you celebrate their victory against the odds.
https://boingboing.net/2017/02/25/counternarratives-not-fact-che.html