Fake news

First published in the New Philosopher, issue 17: Communication.

“The essence of bullshit,” argued philosopher Harry Frankfurt in his 2005 book On Bullshit, “is not that it is false but that it is phony”. Both a liar and an honest person are interested in the truth – they’re playing on opposite sides in the same game. A bullshitter, however, has no such constraint.

Imagine a politician who claims to have witnessed something that did not, in fact, happen: thousands of people in areas of New Jersey with a heavy Arab population cheering as the World Trade Center came down, for example, on September 11, 2001. Now imagine an interviewer confronting him with evidence that his claim is untrue – as journalist George Stephanopoulos did to a politician in November 2015, on ABC’s This Week.

The rules of the game the interviewer is playing dictate that a politician who has been caught in a lie should offer an apology, an excuse, or some compelling new evidence. Instead, this particular politician invoked another strategy: contempt for the rules of the whole reality-based game.

“It did happen, I saw it,” he replied. “It was on television. I saw it.” There was no evidence to support this claim, yet he continued: “It was well covered at the time, George. Now, I know they don’t like to talk about it, but it was well covered at the time.” It wasn’t, and the police said it didn’t happen, but this didn’t matter. Truth may be forceful against lies, but it bothers bullshit about as much as paper darts do a tank.

For Frankfurt, a bullshitter “is neither on the side of the true nor on the side of the false… He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose”. These words, first published in 2005, capture something central to the phenomenon that has come to be called fake news: the belief that emotive impact is not only the supreme test of a story, but the only metric that matters.

‘Alternative facts’ can always be mustered if you equate truth with the most aggressive opinion in the room. The world according to the bullshitter is whatever he wishes it to be. ‘Fake news’ is whatever people he disagrees with are saying. There’s a purity to this that is almost Platonic.

Here are a few further fake news stories for your delectation: “Man stops robbery in diner by quoting Pulp Fiction”; “Presidential candidate sold weapons to ISIS”; “A raccoon has been caught on film riding an alligator”. A little research is all it takes to reveal that none of these things happened (I was genuinely disappointed that the last one turned out to be untrue) – but a little research is precisely what many people don’t want when it comes to their digital diet. Why? Among other things, because many of the information environments we inhabit are magnificently hospitable to meme-sharing, attention-grabbing and OMG-have-you-seen-this moments – and relatively uninterested in hang-on-a-second-let’s-pause-and-think-twice.

The phrase “information environment” isn’t accidental here, and nor are its ecological overtones. News stories – fake and otherwise – exist in constant, ferocious competition for belief and engagement. In our age of information suffusion, their supply is plentiful while the attention upon which they thrive is scarce. The result: an evolutionary hothouse in which the fittest fabrications leap between clicking fingers and glancing eyeballs, while the weak wither unnoticed. It doesn’t take skill to send a lie skipping around the world: just the shameless repetition of whatever some people want to believe, using whatever means have already proved effective.

Fitness, here, is a question of adaptation to two sets of incentives: those engineered by the companies who own our information-sharing systems, and those baked into the brains using these systems by millennia of biological evolution. The result? A festival of bullshit – of attention won at all costs – within which disinformation is not so much the sinister cause of present ills as the symptom of strangely skewed incentives.

We aren’t being manipulated by master persuaders. Rather, we’re choking on mindless memes that will keep on coming – and passing their best tricks on to others – for as long as conditions remain hospitable to bullshit. Which is to say, for as long as it’s both profitable and politically acceptable to sell users’ attention to the highest bidder without regard for facts.

Another philosopher with a fine eye for nonsense, Daniel Dennett, has written at length about the significance of evolutionary pressures in the cultural context. “Once the infrastructure for culture has been designed and installed,” he argues in his recent book, From Bacteria to Bach and Back, “the possibility of parasitical memes exploiting that infrastructure is more or less guaranteed.” Once you have built it, they will come: the freeloading lies, earworms, rumours, conspiracies, and panics.

The most popular fake news is viral in its spread – a metaphor so ubiquitous online that we no longer notice it’s metaphorical – but it also reminds me of a weed, thriving opportunistically in environments too harsh for truth. Indeed, junk culture has some striking parallels to physical junk and its effects. Out there in the real world, indifferent to the stories we tell, environmental degradation threatens us with a world of weeds: choked lakes with few fish, oceans of algal blooms, forests of invasive pests. Online, wastelands inhospitable to complexity similarly expand: unquestioning monocultures of fear and anger, touched with righteous titillation.

Am I sounding too pessimistic? Probably, given the resilience of both truth and human truth-seeking in the longer term. Denying reality may be a fine strategy for information constructs, but it’s not a long-term recipe for political or human thriving. Yet it’s clear that an information environment aimed above all at impact needs serious re-examination if we wish other strains of thought to remain vigorous. Viruses have a habit of killing their hosts.

“The notorious confirmation bias,” writes Dennett, “is our tendency to highlight the positive evidence for our current beliefs and theories while ignoring the negative evidence. This and other well-studied patterns of errors in human reasoning suggest that our skills were honed for taking sides, persuading others in debate, not necessarily getting things right.” This is politics in a nutshell, together with proof – if it were needed – that bullshit has been with us since the beginning.

It will be with us at the end, too, together with everything else evolution has seen fit to gift: our unique capacity for culture and comprehension, our remarkable empathy and violence, our adaptability and intransigence. The question is not whether we can change – because, together, we are always in the process of becoming – but what the systems we are building value the most. Are we playing a game in which truth and lies test their opposed strengths – or one in which winners get to make up the rules as they go along? Are we interested in reality, and what knowledge the future might bring – or would we rather fantasise a life away rewriting history?