Argue with idiots, and you become an idiot.
Trying to identify and inoculate yourself against bad ideas is always worthwhile, but trying to set others straight is a thankless task. And maybe a pointless one too. A good deal of what we believe to be right is based on emotional feedback. We are predisposed to fall for a comforting lie rather than wrestle with an inconvenient truth. And we tend to be comforted by what’s familiar rather than what makes logical sense. We go with what ‘feels right’ and allow our preferences to inform our beliefs. If we’re asked to explain these beliefs, we post-rationalise them; we layer on a sensible logical structure and bury the emotional roots because we instinctively know that it’s not OK to say, ‘Because it just feels right.’ This is why you never win an argument with facts.
In an article called What You Can’t Say*, the computer programmer and essayist Paul Graham argues that a lot of people probably believe things which aren’t true:
It seems to be a constant throughout history: In every period, people believed things that were just ridiculous, and believed them so strongly that you would have gotten in terrible trouble for saying otherwise. Is our time any different? To anyone who has read any amount of history, the answer is almost certainly no. It would be a remarkable coincidence if ours were the first era to get everything just right.
This is very much the spirit in which I wrote What If Everything You Knew About Education Was Wrong?
… we’re all wrong, all the time, about almost everything. Look around: everyone you’ve ever met is regularly wrong. To err is human… In our culture, everyone’s a critic. We delight in other people’s errors, yet are reluctant to acknowledge our own. Perhaps your friends or family members have benefitted from you pointing out their mistakes? Funny how they fail to appreciate your efforts, isn’t it? No matter how obvious it is to you that they’re absolutely and spectacularly wrong, they just don’t seem able to see it. And that’s true of us all. We can almost never see when we ourselves are wrong.
It’s not so much that I think everyone’s wrong; it’s more that I want you to think about what you’d do if faced with incontrovertible proof of being mistaken. The sane and honourable response is to change your mind, but it turns out that’s a lot easier said than done. And it all becomes much harder because there isn’t much incontrovertible proof about anything in education; all we have are probabilities.
The Grail Quest of any right-minded educational professional is to seek out error and bias in our thinking and examine the extent to which our beliefs may be improbable. This is surely uncontroversial, but where should we begin? Graham proposes a test to check whether you’re likely to be mistaken in any of your beliefs: if you don’t have any opinions you would be reluctant to express in front of a group of your peers, if everything you believe meets with official sanction, then the likelihood is you think whatever you’ve been told to think. “Almost certainly, there is something wrong with you if you don’t think things you don’t dare say out loud.”
Obviously, some of what we daren’t say might just be the rotten fruits of a perverse mind. Also, some people dare to say more than others. Generally speaking, I’m happy enough to be disagreeable and will often say things others might find uncomfortable. I’m claiming this as a virtue, but people like me might be a useful lightning rod: what sorts of things do I get in trouble for saying in this blog or on Twitter? Here are a few of the things I’ve blogged about which have upset some people:
- Thinking Hats
- What sort of school I want my children to go to
- Standardised tests are better than teacher assessments
- Why teachers’ intuition doesn’t improve
- AfL is wrong
The fact that people get upset might, of course, be because I’m an insensitive fool, but it could be because some cherished belief has been questioned. If any ideas are characterised as ‘heretical’ or beyond the pale, we should be asking why people might get upset about having their beliefs questioned. Graham suggests one of the main reasons is nervousness. In order to be effective, prohibitions on debate and free-thinking have to be made by those with at least some power to enforce the idea that certain thoughts are unsayable, but who are not so secure that they can afford to ignore criticism. He cites the trouble Galileo got into by suggesting the earth revolves around the sun:
The irony of Galileo’s situation was that he got in trouble for repeating Copernicus’s ideas. Copernicus himself didn’t. In fact, Copernicus was a canon of a cathedral, and dedicated his book to the pope. But by Galileo’s time the church was in the throes of the Counter-Reformation and was much more worried about unorthodox ideas.
Whenever I’ve written about two mutually exclusive ideas, someone invariably tries to close down debate by claiming I’ve identified a false dichotomy, or tells me that writing about something they don’t like is ‘boring’. Why do they do this? If my arguments are genuinely flawed or dull, why read them at all? Why not simply distribute their own better ideas? Attempts to engage with, build on or constructively challenge an argument are welcome, but closing down the debate is always defensive and unhelpful.
A few years ago, those arguing against the excesses of Christine Gilbert’s child-centred inquisition were largely ignored. As the pendulum has swung somewhat to the right, various vested interests have increasingly begun to pop up in my Twitter timeline to call me things like “neo-trad”, whatever that means. As those who’ve built their careers on bad science and dubious ideology get increasingly nervous about losing the argument, the discussion moves, inevitably, to tone, privilege and bullying. Jo Boaler is an instructive example: if anyone criticises her research she immediately accuses them of sexism and academic bullying. This seems reason enough to be suspicious of her claims.
Graham suggests several reasons for seeking out bad ideas: curiosity, the desire to avoid being wrong, and because it’s good for us to question. In addition, he says,
Great work tends to grow out of ideas that others have overlooked, and no idea is so overlooked as one that’s unthinkable… In the sciences, especially, it’s a great advantage to be able to question assumptions. The m.o. of scientists, or at least of the good ones, is precisely that: look for places where conventional wisdom is broken, and then try to pry apart the cracks and see what’s underneath. That’s where new theories come from.
A good scientist, in other words, does not merely ignore conventional wisdom, but makes a special effort to break it. Scientists go looking for trouble. This should be the m.o. of any scholar, but scientists seem much more willing to look under rocks.
As Tom Bennett has written recently, there are still plenty of rocks teachers need to look under.
Although nothing should be unthinkable, it’s worth considering whether it might be better to avoid saying contentious things in order to avoid getting embroiled in pointless arguments. If you’ve found something particularly filthy lurking in some benighted crack, Graham’s advice is, “don’t say it. Or at least, pick your battles.”
My advice is a little different. We should always critique assumptions, always examine ideas for fault lines, always try to break received wisdom. If you can’t break it, it’s probably fine. If you can, don’t keep quiet about it: you have a duty to tell people. But don’t get bogged down in destructive disputes about intersectionalism or whatever. Just speak your truth into the void and know that there are a few thoughtful types who seek to explore rather than merely confirm their biases.
When is it worth thinking the unthinkable? Always. When is it worth arguing with idiots? Never.
*Many thanks to Greg Ashman for sending me the article.