It’s difficult to ignore the appealing certainty that the times in which we are alive are unique and fundamentally different to any that have gone before. The most cited reason for this is the fact that the internet has changed everything.
Technology has been transforming education for as long as both have existed. Language, arguably the most crucial technological advancement in human history, moved education from mere mimicry and emulation into the realms of cultural transmission; as we became able to express abstractions, so we could teach our offspring about the interior world of thought beyond the concrete reality we experienced directly.
This process accelerated and intensified with the invention of writing, which Socrates railed against, believing it would eat away at the marrow of society and kill off young people’s ability to memorise facts. He was right. The transformative power of writing utterly reshaped the way we think and how we use knowledge. From the point at which we were able to record our thoughts in writing, we no longer had to memorise everything we needed to know.
But education was very much a minority sport until the advent of the printing press, when suddenly books started to become affordable for the masses. Before Gutenberg, there was no need for any but a privileged elite to be literate, but as the number of printed works exploded exponentially, the pressure on societies to prioritise universal education slowly grew until, by the mid-twentieth century, education became not only a requirement but a right.
The rate at which we now produce knowledge is staggering. The architect and inventor Buckminster Fuller identified what he called the Knowledge Doubling Curve.* He noticed that until 1900 human knowledge doubled approximately every century. By the mid-twentieth century, knowledge was doubling every 25 years. Today, on average, human knowledge doubles in just over a year. Some estimates suggest that what we collectively know will soon double every 12 hours.
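To get a feel for just how fast that last estimate compounds, here is a minimal back-of-the-envelope sketch. The 12-hour doubling period is simply the figure quoted above; everything else is illustrative arithmetic, not measured data:

```python
# Illustrative only: how quickly "doubling every 12 hours" compounds.
# The 12-hour doubling period is the estimate quoted above, not measured data.
hours_per_week = 7 * 24                    # 168 hours in a week
doublings_per_week = hours_per_week / 12   # 14 doublings per week
growth_factor = 2 ** doublings_per_week    # 2^14 = 16,384-fold growth

print(f"Doubling every 12 hours implies a {growth_factor:,.0f}-fold increase per week")
```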
No wonder so many have been persuaded that there is no longer a need to learn facts, as what we know will quickly be superseded and, after all, we can always just look up whatever we need to know on the internet. This erroneous belief has certainly had a transformative, if largely nugatory, effect on education in the last decade or so. I say nugatory because knowledge is only knowledge if it lives and breathes inside us.
There’s a world of difference between knowledge – the stuff we think not just about, but with – and information. To make sense of the vast swathes of information available to us we need to know quite a lot. If you doubt this, consider what happens when you ask a student to look up an unfamiliar word in a dictionary: they may end up with five or six more they have to look up in order to understand the definition of the first. Some things we just need to know. What we know makes us who we are. Knowledge is what we both think with and about. You can’t think about something you don’t know – try it for a moment – and the more you know about a subject the more sophisticated your thoughts become. In order to critique the world we need to know as much as possible about its science, history, geography, languages, mathematics and culture.
But in response to the apparent obsolescence of knowledge, schools started reinventing themselves as places where children would learn transferable skills which would allow them to navigate the shifting, uncertain world of the future. Maybe the traditional curriculum of school subjects has had its day, as tech guru Sugata Mitra claims. Maybe all we have to do is teach kids how to use Google and they will magically teach themselves all they need to know? After all, most of what schools teach is a waste of time, it seems. According to Mitra, the Chinese and Americans “don’t bother about grammar at all”. Children don’t need to know how to spell, and “the less arithmetic you do in your head the better.”
There is nothing more philistine, more impoverished, than reducing the curriculum to the little that’s visible through the narrow lens of children’s current interests and passing fancies. How do they know what they might need to know? And in any case, do we really want to educate the next generation merely in what we think they will need?
Of course the future is uncertain and unknowable, so how best can we prepare students for it? Well, perhaps we should stop delivering rapidly outdated facts and instead teach students the skills they will need to thrive in the 21st century. And what are these futuristic skills? Typically they are considered to include critical thinking, problem-solving, communication, collaboration and creativity. Wonderful things, all of them – but attempting to substitute them wholesale for a more traditional school curriculum comes with problems.
Problem 1
Are these really ‘21st-century’ skills? Or in fact, hasn’t this stuff always been pretty important? And if it was important for Socrates to think critically, Julius Caesar to solve problems, Shakespeare to communicate, Leonardo da Vinci to be creative and the builders of the Great Wall of China to collaborate – how on earth did they achieve what they did without a specific, 21st century learning curriculum? The point is, these skills are innate human characteristics. We all, to a greater or lesser degree, use them all the time. How could we not? Of course, we can encourage children to be more creative, critical and collaborative, but can we actually teach these things as subjects in their own right?
Problem 2
How, exactly, do you teach someone to communicate or solve problems in more sophisticated ways? What is it we want students to communicate? What sorts of things do we want them to create? What do we want them to collaborate on? The problem with attempting to teach a generic skill like critical thinking is that you must have something to think critically about; if you know nothing about quantum physics, no amount of training in critical thinking is going to help you come up with much on the subject that is very profound. Likewise, to be truly creative we need to know a lot about the form or discipline we’re trying to be creative in. Skills divorced from a body of knowledge are bland to the point of meaninglessness. These so-called 21st-century skills are in fact biologically primary evolutionary adaptations. As I explained here, we are innately creative. We solve problems as a matter of course and collaboration comes to us naturally. What makes people appear to struggle with these innate attributes is that we want them to use them to manipulate biologically secondary, abstract knowledge. Anyone can collaborate on a playground game, but to collaborate on finding a cure for cancer you would need a lot of highly specialised expertise. The only thing that makes these innate skills desirable in the 21st century is the academic content on which they depend.
Problem 3
Is teaching facts really such a bad thing? Of course it’s true that we’re discovering new information at an exponential rate and that no one can ever learn anything but the tiniest fraction of what is known. Apparently, when Newton formulated the laws of force and invented calculus, he knew everything that was then known about science. This is no longer possible; as our collective knowledge grows, our individual ignorance seems to expand. It might be the case, then, that the amount of new information is doubling every two years – but is it really true that half of what students studying a four-year technical degree learn in their first year will be outdated by their third, as the makers of Shift Happens assert? Maybe those studying highly specialised areas of computer science will find the programming languages they learn are quickly superseded, but that doesn’t make the practice and discipline of learning them in the first place totally useless. And in most other fields of human endeavour – medicine, engineering, law, teaching – new discoveries and practices build upon a settled body of knowledge. Depriving students of this foundation is in no one’s interest and will do nothing to prepare young people for an uncertain future.
Looking backwards to move forwards
We are still easily seduced by the bright lights and glamour of the new (even when it’s not ‘new’ at all, just packaged and lauded as such). It’s all very well to criticise current qualifications, but to suggest that exams should be aligned with some supposed change to the way students learn and think is naively foolish. In case you doubt that anyone sensible might take this line, I give you this example:
Th[e] latest revision to curriculum and assessment has not been designed for young people living in the 21st century, with 21st-century minds, and should be challenged. It does not fit with the era we are living in and it penalises a generation of young people who use their brains and knowledge differently through technology.
We may be living in the 21st century but, despite the many ways in which technology has advanced, we are still very much the descendants of primitive hunter-gatherers. If we really wanted exams that fit with the way we think, we’d probably be best off testing basic survival skills. Regrettably perhaps, the modern world places an increased value on those brains that can best rewire themselves to cope with applying and manipulating abstract knowledge. Thankfully – as I explained here – although we may be out of practice, remembering stuff is not only intellectually undemanding, it also helps students to think better.
* More properly this should be termed an information doubling curve.
You asked the question of young people: “How do they know what they might need to know?” If we ask that question of educators (and governments): ‘How do you know what young people might need to know?’, how do you reply?
Clearly none of us can see into the future and divine which knowledge will be most critical, but we can make savvy judgements regarding the knowledge that ought to be imparted. Leaning on Young, Bernstein et al., we can say which knowledge allows us to make predictions regarding the phenomena around us in the world. Knowing your multiplication tables saves a lot of time. We can also make reasonable predictions regarding which aspects of our cultural heritage are still useful to know and understand. Furthermore, we can make decisions regarding the kinds of literature that will expand our students’ understanding of their fellow humans’ experiences, beyond the understandings they would come to in their own neighborhoods and worlds.
I like your answer. Lots of specific examples.
Speaking for myself, because I know a lot more than the average child.
And governments? How do they know what young people need to know?
I’m deeply sceptical of a government’s ability to prescribe a curriculum. That said, we vote for them and they do this on our behalf. If we’re unhappy, we vote them out.
I especially appreciate the timing and content of this post. I’ve been mulling over the issue of student choice in curriculum, and to be honest, I don’t know how it is helpful. I am certainly thankful that I wasn’t the one guiding the content of my education. My world would be significantly smaller if I had.
Thanks David, this is a really interesting reflection. I advocate an approach (and work in an organisation that does too) of explicitly and rigorously building skills such as the ones you mention, but I am always wary of a downplaying of knowledge: knowledge is a beautiful, gorgeous thing in and of itself.
My contention would be that whilst “21st century skills” are, indeed, as you articulate, anything but, they have become far more widely required of the working population and this will continue to be the case in the ‘fourth industrial revolution’. Leonardo Da Vinci might have needed to be creative, but the considerable swathe of his contemporaries who were engaged in menial labour did not need to be. The collaboration of the builders of the Great Wall of China (largely soldiers, prisoners and peasants) was organised by a tiny minority, who gave orders. We can see a similar pattern in literacy: life in the modern world is incredibly difficult if you are illiterate, which is a relatively modern phenomenon. I would argue that life is going to get hard if you haven’t developed a high level of skills.
Automation means the economy of the future is going to be founded on jobs which can’t be subsumed by artificial intelligence and which require us to work to meet challenges like mass migration, climate change and immunity to medicine. This will require empathy, divergent thinking, resilience, collaboration, a willingness to commit to lifelong learning and flexibility. A balanced curriculum, where skills such as these are referenced and taught with rigour and developed alongside a knowledge base, is surely what we should be aiming for.
I am really pleased to see you talking about this; the question of whether our schools and curricula are fit for purpose for meeting the challenges of the future needs to be one everyone is talking about!
Let me reiterate: “21st century skills” are innate and as such require no teaching. What needs instruction is the academic content to which we want students to apply these skills.
My answer to your slightly off-topic question would be: how about teaching something worthwhile in detail, drawing relevant links between new and old knowledge, pushing students to challenge their thinking by providing more information, and exposing them to strong counterpoints? In short, develop all those wonderful skills while rarely emphasising or focusing on them. The choice of topic is going to be dictated by the tutor, as they will be unable to do the above in unfamiliar topics.
Most knowledge we learn as children (or adults) has very little chance of being significantly relevant to our future life choices; however, the way in which we learn to acquire, process and update that knowledge will be useful practice. Ironically, focusing on becoming knowledgeable in something is the best way to develop that, rather than taking an educational punt on undefined and poorly understood future skills.
As a pedantic side note, David: human data has doubled at the speed you described. Unless we define knowledge as data, it has likely grown at a much slower pace. I don’t think it materially affects your argument though.
[…] A new world needs new ways of learning, argues The Learning Spy […]
I really like what you have to say and agree. As I reflect on it, it seems to me that these 21st-century skills aren’t skills at all but more attitudes. Which you can’t teach, but I guess as educators we can model these attitudes. But you are right, not in a vacuum. You need the knowledge. But what knowledge is the most important? I guess it depends on what is going to be most useful for you in your vocation, so from secondary school onwards you can focus the knowledge more. But from a primary school perspective, a generalist teacher perspective, what knowledge then should be the focus? Should the goal be to enhance skills? And by skills, I mean actual skills, not “21st century” attitudes. But communication skills (incorporating reading and writing and speaking and listening), and thinking skills (metacognition, evaluation, summarising etc.), and self-management skills (organisation, time management, informed choices). I know you can’t teach these in a vacuum, but does the knowledge matter? Maybe there are certain conceptual understandings that are most important?
This is all thinking out loud by the way, not sure if true or not, just reflecting on your article. Thanks.
Yes, I think the knowledge really does matter.
Here are some posts which might help you understand my position:
Why what you teach matters
What do we mean by skills?
What are thinking skills and can we teach them?
What do you think about our methods of assessment being outdated? Outside of the classroom most students – indeed, most adults – never really handwrite anything, yet unless a student has a specific learning difficulty they’re expected to handwrite all of their exam responses. Do you think moving to a paperless classroom – and correspondingly paperless examinations – would be a fairer/better way of assessing students?
I know I prefer working things out on paper – and I greatly appreciated your post about why handwritten notes are more effective for retention than typed ones in January – but is there a case for assessments being word processed? Is the curriculum lagging behind reality, or ought we to be encouraging students – and by extension future generations of adults – to write more, by hand?
It’s certainly something that comes up regularly, in departmental meetings and IT development meetings, and I don’t have any ready, simple answers (indeed, I suspect there is no simple answer).
I think the momentum towards a paperless assessment system is inevitable but ponderous. We’ll get there eventually, but for the time being it’s impractical to get every student to word process their exams.
Why does it matter which month you type a note? But seriously, how many people whose jobs require collaboration don’t use a whiteboard and write on it by hand? How many university courses require handwritten work as a form of authentication?
[…] This is an incredibly useful framework for deciding what should be covered in a school curriculum. We should think carefully about whether what we are seeking to teach is biologically primary or secondary. If it’s a primary adaptation, then maybe we don’t need to teach it at all, as children will have an innate ability to pick it up from their environments. That said, maybe we do need to make sure that children’s environments are conducive to acquiring the folk knowledge we all take for granted. Just because the capacity to learn this stuff is innate, it doesn’t follow that we will learn it if we’re locked in a darkened room. This might provide an argument in favour of ‘play based’ approaches in Early Years to ensure all children are immersed in the kind of environment in which they pick up speech, group cooperation and a sense of self. But, if we’re tempted to teach these kinds of things explicitly later on in education we could be wasting our time. This is the argument I advance against a curriculum based around so-called ’21st century skills’. […]
[…] say that, “Yes of course knowledge is important” in one breath and then bang on about so-called ’21st century skills’ in the […]
[…] For a more detailed critique of ’21st century learning’, see this post. […]
[…] it’s unlikely to be much use in developing resilience (or critical thinking, or any other so-called 21st century skill) in, say, […]
[…] represent our different cognitive abilities. They’re also, you may have noticed, the trendy 21st century skills we hear so much about. Blue pillars are – I think – mostly declarative, while the […]
[…] to being able to interpret financial statements or legal reports. As regular readers will know, I’m deeply sceptical that such generic, transferable skills can be taught. You can’t think about – or interpret – something you don’t know. The idea […]
[…] will still be hampered if you then impose a curriculum that prioritises generic competencies or 21st century skills. We only become skilled or competent at thinking about things we know and know well. This is an […]
Hi David,
Wondering if you could clarify something? Are you suggesting that mastery of a skill like collaboration can be achieved passively, without intervention? As in, if students are immersed in increasingly complex knowledge-based situations requiring collaboration, they’ll automatically learn and demonstrate the most efficient strategies?
Relating to your analogy, the journey from collaborating in playground games to solving global health issues surely includes some explicit intervention, no? You mentioned our innate capacity, but capacity is merely another word for potential… which I don’t believe can be realised without instruction.
Perhaps I misunderstood your argument.
What do you mean by “mastery”? Does anyone ever ‘master’ collaboration? What I’m suggesting is that collaborating with other members of the tribe has been essential for human survival for so long that we have an evolved bias towards acquiring the ability without instruction. That does not mean that we cannot additionally teach children about the value of turn-taking, building on others’ ideas and so on; it’s more that this has far less value than is often thought. The reason that some people may be better collaborators than others is, at least to some extent, a product of personality and, almost certainly, DNA. As with any evolved capacity, there will be a normal distribution of ability.
But the thing that really makes collaboration such a prized asset in the modern workplace is expertise. No one can effectively collaborate with others on something they know little or nothing about; expert collaboration relies on domain expertise. When we consider, say, the huge numbers of scientists who collaborated on the Human Genome Project, no effort needed to be wasted in teaching these experts how to discuss with each other; the effort was put into designing efficient systems and mechanisms that allowed them to share and discuss easily.
So, to be clear, no amount of instruction can a) teach somebody to do something they can already do (try teaching a normally sighted person to see better!) or b) replace the requirement for domain expertise.
[…] discount knowing stuff because the international consensus is that 21st century skills are more important than knowledge per se. These 21st century skills are generally recognised to be the four C’s of communication, […]