Emotion trumps evidence in a post-fact world (same as it ever was)

As you’ve no doubt heard, “post-truth” has been named “word of the year” by Oxford Dictionaries. But that shouldn’t lead us to think that truth or evidence has ever mattered as much as we might prefer, or that this post-truth world represents a complete break from the past.

All of us have always been susceptible to various forms of irrational thought resulting from bias in how we interpret – and even recognise – evidence, as elegantly illustrated in the work of people like Kahneman and Tversky, as well as in popular science books by Dan Ariely and others (like me and Caleb Lack (sorry not sorry)).

Of course some expressions of these biases are easier to spot, and more dangerous, than others – confirmation bias, for example, might reinforce your racism just as easily as it supports your belief that football team X plays the most elegant style of football in human history.

We shouldn’t fool ourselves that humans in general used to be these rational and objective thinkers. We haven’t changed – what’s changed is our exposure to the range of views held by others, and more importantly the amplification of those views by social and traditional media.

This exposure and amplification says nothing about the quality of the views expressed, of course. And here’s the problem I want to briefly address: the fact that it’s good for the knowledge-production process to be open to all doesn’t mean that truth is decided by popularity.

Universities or the dinner tables of the upper classes are as capable of being filter bubbles as any other environment, and the assumption that they are necessarily more objective and responsive to evidence than the masses certainly needs to be challenged.

But challenging that idea doesn’t need to mean rejecting the idea of expertise itself – it means rejecting the idea that good arguments can only belong to those of class X, education Y, or race/gender Z (etc.).

As an illustration of how these biases can exist even in universities, consider the idea that liberal academics significantly outnumber conservative ones in the USA (an idea that enjoys a broad consensus, even though there are dissenters).

If there is a liberal bias, it’s going to be disproportionately difficult (for example) for a religious student to enjoy as much academic freedom (hypothetically, to express a view that homosexuality is immoral), or for a politically conservative student to open a classroom discussion on whether gender-fluidity is a sensible concept.

The problem here (and of course there are conservative universities where the bias would lean in the other direction) is that it’s sometimes only by having those uncomfortable conversations that minds can be changed.

Protecting students – and society in general – from abuse and intolerance is a good thing, but we can also overcorrect, shielding people from any threat to their intellectual complacency and groupthink. That is a bad thing, no matter how much we might agree with the tenets of the groupthink in question.

There’s a middle ground here, even though it’s a tricky one to define and occupy. Just as politics in general tends toward polarisation, so does the situation at universities. You can be impatient with caricatures of safe spaces and trigger warnings, as I am, while also believing that some members of university communities might invoke these concepts too frequently.

The election of Donald Trump to the US Presidency, and Brexit, are to my mind good examples of why we need to fight for nuance and avoid demonising our opposition (whoever “we” are). But how do we do so? There are some suggestions in this piece on Trump, to which I’d want to add the following:

  • The “principle of charity” and Rapoport’s Rules – once we know that we’re prone to misrepresenting people and ideas that we don’t agree with, it’s easier to resist those tendencies.
  • Familiarising yourself with fact-checking resources, and using them. For example, my friend Shane Greenup’s Rbutr, a browser plugin that lets you see, and submit, rebuttals to things encountered on the Internet; or the urban-legend-debunking website Snopes.
  • And finally, all of us need to try to expose ourselves to divergent views, and to think about their virtues as objectively as we can. Using a “mute” function on Twitter might sometimes be necessary for sanity-preservation, but it’s dangerous (to your own thinking) when you use it simply to construct a filter bubble.

The presumption of being right can allow us to pretend that “shutting down a discussion is the same thing as winning an argument”. But “silence is not persuasion”, as this piece on reactions to Trump’s victory argues – a piece describing campuses where students were given colouring-in books and puppies to help them get over the trauma.

As is the nature of many such pieces, it’s far too snide and unsympathetic, yes. But it nevertheless makes the good point that the shock and discombobulation some felt at a Trump victory arose because they live in a bubble of like-minded people, never meeting people who disagree with them, and therefore seldom developing the arguments necessary to defeat the views those people hold.

Merely being right isn’t enough (assuming you’re right, of course). Anything can become unthinking dogma, and regardless of the truth of the conclusion reached, something important is lost with dogma: both the self-awareness needed to recognise the times when you are wrong, and the ability to find common ground with those who hold a different view.

Also published on Medium.

  • An Ardent Skeptic

    Great post, Jacques! You may already be a regular listener to the “You Are Not So Smart” podcast but, just in case you aren’t, here’s an interesting episode which I think is relevant:

    The link:

    And the description of the episode from the “You Are Not So Smart” website:

    “In this divisive and polarized era how do you bridge the political divide between left and right? How do you persuade the people on the other side to see things your way?

    New research by sociologist Robb Willer and psychologist Matthew Feinberg suggests that the answer is in learning how to cross something they call the empathy gap.

    When we produce arguments, we do so from within our own moral framework and in the language of our moral values. Those values rest on top of a set of psychological tendencies influenced by our genetic predispositions and shaped by our cultural exposure that blind us to alternate viewpoints. Because of this, we find it very difficult to construct an argument with the same facts, but framed in a different morality. Willer’s work suggests that if we did that, we would find it a much more successful route to persuading people we usually think of as unreachable.”

    • Thanks! I’ll take a listen to the podcast – for some reason, I’ve never given YANSS a try, even though I’ve long been aware of it.
