How do you stop people believing myths? The short answer is, it depends on how strongly people believe the myths. I’ve just read The Debunking Handbook – an excellent, free and succinct (only 9 pages in length!) manual produced by Skeptical Science for tackling misconceptions. In the section on what it refers to as the ‘Worldview Backfire Effect’ it makes the point that, “You … stand a greater chance of correcting misinformation among those not as firmly decided about hot-button issues. This suggests that outreaches should be directed towards the undecided majority rather than the unswayable minority.”

When I wrote What if… I accepted that no matter how I presented my arguments and regardless of the evidence I assembled, there would be some people so firmly wedded to an opposing set of beliefs that I could never convince them. But for those who were undecided I took a lot of time and trouble to anticipate how and why they might disagree, in the hope that this approach might at least make them more aware of their own cognitive biases.

But when it comes to the unswayable minority, I have to accept that, for some, I am a hate-filled fascist who hates children and kicks puppies. They are likely to disbelieve anything I say simply because it’s me saying it. Even more moderate and reasonable people with whom I disagree tend to see me as closed-minded and inflexible in my views.

Consider this recent exchange on Twitter. First of all, self-confessed evidence sceptic Sue Cowley made the following statement:

To which David Jackson, a partner of the Innovation Unit, responded by saying, “As with PBL. Some things are true beyond ‘evidence’.”

The idea that a thing could be “true beyond ‘evidence’” intrigued me, so I asked – very politely – how that worked. He replied with this:

Being sceptical – it would appear – means that I am less likely to understand or accept how Project Based Learning (PBL) could be true beyond ‘evidence’. The implication being, we should accept that it’s the right thing to do as an article of faith. This is, as far as it goes, fair enough. Psychologist Daniel Gilbert suggests we are predisposed to take pretty much anything, even obviously nonsensical or ludicrous things, on faith. He submits that in order to try to understand a statement we must first believe it. Only when we have worked out what it would mean for the statement to be true can we choose not to believe it. So although certain beliefs are contested, I’m willing to accept, for instance, that the Holocaust occurred, that Neil Armstrong walked on the moon and that Elvis didn’t. Others may not be so eager to accept these articles of faith, but in order not to do so they must first believe them. So, while I understand how faith works, my scepticism means I’m unlikely to accept religious claims as true in the absence of substantiating ‘evidence’.

Jackson suggested that my inability to accept his worldview was “a mindset thing”, i.e. my refusal to believe was evidence of a fixed mindset, whereas his unquestioning belief should perhaps be seen as him having a more elevated growth mindset.

Now, I’ve logged my scepticism of the ‘mindset thing’ before and won’t go into it again here, but I do think it’s interesting that being prepared to accept something without evidence should be praised as ‘growth mindset’ instead of condemned as credulity. In the end Jackson closed down the discussion by calling my questions an ‘inquisition’ – a position from which it’s much easier to dehumanise opposing opinion and dismiss criticism.

And dismissing criticism is vital in reducing cognitive dissonance. When we come across information which contradicts our beliefs we must choose one of three options:

  1. Change our beliefs to fit the evidence.
  2. Seek out new evidence which confirms the belief we’d prefer to hold.
  3. Reduce the importance of disconfirming evidence.

Cognitive dissonance has a dramatic impact on how we react when confronted with folk who disagree with our most fervently held beliefs. We tend to assume they must be ignorant, stupid or evil. When we’re critical of anything that someone else holds dear, the standard response is for our opponent to point out that we clearly don’t understand their position. When we present the incontestable evidence that we do understand, opponents often treat us as if we’re a bit silly: only an idiot could believe anything so ludicrous and patently untrue. When they finally accept that our counter-arguments are sufficiently cogent that we prove ourselves to possess at least a modicum of intelligence, there are only two remaining propositions: either we are evil or they are wrong. Of these, it is far easier, and massively less damaging to the sense of self, to assume that we must be unscrupulous villains seeking to poison children’s life chances.

What I think some of those who see me as a member of some sinister right-wing ‘neo-trad’ conspiracy forget (or are perhaps unaware of) is that it wasn’t always so. I made the point to Jackson that I began as a believer in PBL and only when presented with ‘evidence’ did I come to change my mind. Does this make me more open or closed-minded? Should people be more alarmed by my scepticism of arguments which are unsupported by evidence, or by Jackson’s and Cowley’s scepticism of ‘evidence’?

After enough time and enough repetition, even the most troubling ideas can be accepted as true. Schopenhauer (may have) observed, “All truth passes through three stages. First, it is ridiculed. Second, it is violently opposed. Third, it is accepted as being self-evident.” I labour on in the hope that the swayable majority will continue to be persuaded by the balance of probabilities.