No one seems clear who first said it, but it’s become an abiding truth of journalism that, “If a dog bites a man, that is not news. But if a man bites a dog, that is news.” To publish an article in which an octogenarian educationalist says basically what he’s been saying for the last few decades would not be news. But if said educationalist were to bite another well-known bastion of traditional education? Publish and be damned!
So, in a recent article about the nonsense of selecting what to teach based on whether material is cognitively ‘age appropriate’, E. D. Hirsch Jr makes the following aside in the midst of a solidly sensible and perfectly reasonable argument:
We have become disappointed in policies and programmes that seemed experimentally promising, such as smaller class sizes, direct instruction and Success for All. They were all supported by carefully conducted experiments, but in the long run they have disappointed.
Somehow this got turned into, “There is no scientific basis for Direct Instruction” on the front page of the TES magazine. Perhaps unsurprisingly, this has been seized upon as some sort of proof that so-called traditional teaching methods don’t work. It should also not come as too much of a surprise that I don’t think this is a reasonable conclusion to draw from Hirsch’s remarks.
First, we have to work out what Hirsch means by direct instruction. It may be that he’s using it as a catch-all term for what’s more commonly referred to as explicit, whole-class or teacher-led instruction. This would include any teaching method where teachers tell kids stuff, explain what things mean or direct practice. Seeing as this approach to teaching has held sway for most of human history and has over that time proved very effective at passing on human knowledge and culture, it seems unlikely that Hirsch is arguing for a more child-centred, discovery approach to teaching. Instead, despite the lack of capitalisation in the TES article, it’s rather more likely that he’s referring to Siegfried Engelmann’s Direct Instruction programme.
Direct Instruction is a very specific method of both teaching and curriculum design. It takes as its starting premise that if children struggle to learn, this should be seen as a problem with the instructional design rather than evidence that the child is incapable of learning. Engelmann sought to eliminate anything in his instructional sequences that could be considered ambiguous or misleading with the result that his scripted programmes could be faithfully reproduced by any teacher anywhere.
The “carefully conducted experiments” Hirsch mentions might be a reference to the humorously titled Project Follow Through, which ran from 1967 to 1995 – the largest, most expensive education study ever conducted, involving over 70,000 students in 180 schools across the US. Follow Through pitted various approaches to teaching against each other in a straight horse race, with Direct Instruction the clear winner in all categories. Not only was it the most effective programme at improving students’ literacy and maths skills, it also outperformed all other models for more generic cognitive skills and other affective areas such as self-esteem and student engagement.
So, what happened? Did Direct Instruction go on to conquer the world as the most effective method for teaching children? No. In fact, as Douglas Carnine (2000) observed:
[DI] was not specially promoted or encouraged in any way…federal dollars were directed toward less effective models in an effort to improve their results…. [S]chools that attempted to use DI —particularly in the early grades, when DI is especially effective—were…discouraged by education organizations.
Hey ho. The fact that few teachers in the UK are even aware of what DI actually is, let alone have used it in the classroom, speaks volumes. No wonder Hirsch finds it disappointing.
But, that’s not all. It turns out Engelmann and Hirsch have some beef. In 2002, Engelmann took umbrage at an article Hirsch had written criticising educational research as cargo cult science. Engelmann spelt out in no uncertain terms precisely where he felt Hirsch had failed to appreciate the merits of a study such as Follow Through. The debate between these two elder statesmen of traditional education makes for interesting reading and I have some sympathy with the positions of each. Hirsch is right to point out the inability of classroom research to find out why an intervention might work, but Engelmann is right to say that it can still prove that one approach is more effective than another. Could it be that Hirsch dropped in his DI reference as a bit of academic afters? I couldn’t possibly comment.
To conclude, we may not know what the best way to teach is – we may never know – but we do have very clear guidance, from a wide variety of sources, that some interventions are more successful than others. I’m not claiming DI is the way to go if you want to deliver the sort of knowledge-rich curriculum Hirsch advocates, but there is clear evidence, both from laboratories and from field testing, that minimal guidance is less effective than more explicit approaches for school students.
Ignoring what evidence there is in favour of what you prefer to be true is exactly what Hirsch says we need to stop. He argues that the problem is a belief in what he calls “providential individualism – the focus on the unique individual rather than on acculturation, combined with the belief that some supervising providence, like nature or the free market, can guide our educational policies. On the contrary, it’s neither providence nor nature, but we adults who need to decide quite specifically what our children should know and be able to do.”
I’d end by asking, who could argue with that? Except, of course, there are all too many who would.
The article that Engelmann took umbrage at is here: https://t.co/bTgsj2aKNJ
Essentially, Hirsch has long argued that education research was “difficult and undependable” whereas cognitive science arising from well-controlled laboratory studies gives rise to “reliable general principles” which can be applied more effectively within education.
I suspect some of that nuance was lost in the article.
If Hirsch is here recapitulating the argument he made in Why Knowledge Matters, then the headline is certainly terribly misleading.
Hirsch discusses the Direct Instruction and Success for All programs specifically on pages 46-48 of that book. His basic critique is that both DI and SfA show fadeout after early elementary. Here’s a portion:
“That same view–that content is basically a vehicle for skills–also characterizes scripted programs like Direct Instruction and Success for All. They are superior to natural-development approaches in teaching the mechanics of reading. But we now know that this superiority fades over time [‘They were all supported by carefully conducted experiments, but in the long run they have disappointed.’], once the mechanics are gained. Both natural-development programs like ‘balanced literacy’ and intensive-skills programs like Direct Instruction and Success for All end up very much the same by grades 5 or 6.”
As perhaps one of the few remaining teachers involved in this discussion who has taught using Engelmann’s Direct Instruction course materials, I’d like to add an observation I think is missing.
DI seems to have been most successful in early and special education. If you get a chance to read Engelmann’s guidebooks and the course materials, you’ll see how basic the concepts and knowledge covered are. And when you get to grips with the teacher text used to lead the course materials, you’ll wonder how teachers could follow this approach with older, mainstream pupils in a U.K. school.
Maybe I didn’t get to see the materials for older, mainstream pupils, but it’s hard to see British teachers using scripted lessons and their pupils not laughing, so alien is this cultural approach.
So while the curricular design of DI may be ultimately logical in its progression, its undiluted teaching method may remain a cultural step too far for British schools.
Do you think that this same approach could be used with a knowledge-rich curriculum, or do you think that the DI approach wouldn’t be compatible?
The idea of a system to enhance clear transmission sounds very appealing, but I’ve never tried it in practice.
Yes, I do think it possible. But you’d need to match a modern version to the cultural context of the U.K., I’d say, to give it its widest adoption.
There is a consortium of schools in Baltimore that use both Engelmann’s DI and Hirsch’s Core Knowledge. From what I understand these schools have incredible outcomes, even though most of their students come from disadvantaged backgrounds. It’s called the Baltimore Curriculum Project. http://www.baltimorecp.org/history.html
J.D. Fisher, does Hirsch provide any reference for the claim that students of balanced literacy and DI end up at the same (low) level of literacy by grades 5-6? I haven’t heard that claim before.
Hi, Angie.
Hirsch cites a 1982 study: “A Follow-up of Follow Through: The Later Effects of the Direct Instruction Model on Children in Fifth and Sixth Grades,” American Educational Research Journal 19 (1982), pp. 75-92. The study authors are Wesley C. Becker and Russell Gersten.
Not sure about a reference for balanced literacy.
Thanks! Just found a link to that paper. The authors looked at 5th and 6th graders who had received direct instruction in grades 1-3 but were no longer being taught through DI. They found that students did retain previously learned knowledge and skills from those early years. However, they struggled to learn new knowledge and skills in the upper grades. The authors suggested that these students need continued instruction in DI in the upper grades in order to continue making gains, which seems like a reasonable suggestion.
https://www.researchgate.net/profile/Russell_Gersten/publication/240801762_A_Followup_of_Follow_Through_The_Later_Effects_of_the_Direct_Instruction_Model_on_Children_in_Fifth_and_Sixth_Grades/links/54595cfe0cf2bccc4912bb5a.pdf
Ha, interesting.
Hirsch even concedes in the book that a core-knowledge sequence of some kind should be maintained throughout elementary to be effective. So, he’s applying a standard to DI (that it can “launch” students into academic success and then get left behind) that he doesn’t apply to his own treatment.
I suppose this would be fair if DI billed itself at any time as a methodology that “launches” students as described. But that doesn’t strike me as true.
Well caught, Dave, and excellent analysis. I think both Hirsch and Engelmann have critiquable foibles, and I like your take that this may be a squabble between “elder statesmen”. I sided with Hirsch in the previous disagreement and I’m afraid I side (as you know) with DI on this one. Neither of them is sacrosanct. Both have important wisdom for the ed world.
Sorry, didn’t sign … WISE Math here = me
We introduced DI to my school 12 months ago and it’s had an extraordinary impact on the progress of our weakest students. We also love Hirsch. In our experience, it’s DI that is a highly efficient way to raise the reading age of pupils who arrive in Year 7 with very poor literacy, and it’s a Hirschian knowledge-based curriculum that then equips pupils with the background knowledge and expertise to access subject-specific texts in an effective way.
The TES headline is a travesty. Hirsch is a strong supporter of explicit, whole-class instruction. He is only pointing out that, by itself, no method can lead to long-term results in the absence of a coherent, knowledge-rich curriculum.
It would be helpful if Hirsch clarified his comments about direct instruction and its relation to the Core Knowledge approach. CK is clearly not discovery learning (when I teach the CK Ancient Civilizations unit, I don’t tell the kids to come up with their own questions about what they want to learn about civilizations, then have them create their own investigations to answer those questions, then write up the answers on a blog). I thought that the Core Knowledge approach fit within the parameters of the direct instruction approach – explicit, carefully sequenced knowledge-building units of study.
Every day, physics teachers (for example) have to face a situation in which they’ve got to teach something like ‘Young’s Modulus’. You can learn this from a book; you can be told it from the front of a class of 10, 20, 100 or 1,000 pupils. You can even be told how to conduct an experiment to show the relationship between weight and the stretching of springs, and so on, without actually doing it. Then when you’re asked a question about it in an exam you can get it 100% right. You can also teach it as an ‘experiment’, and in so doing you can teach it pretending that ‘we’ don’t know how it’ll turn out, or you can teach it telling the students that you know how it’ll turn out and let’s see if it does.
At the end of either of those procedures, you may or may not know more or less about Young’s Modulus, and you may or may not do better in the exam. You may or may not know more about the principles of experimental science. You may or may not be able to apply those principles to something else that you are finding out about.
These are the day to day issues that face science teachers, aren’t they?
Thoughts?
What do you make of these?
Kids did a Mexican wave for first waves, and then talked about how sound was not like that at all! Then, the kids made themselves into a solid by standing shoulder to shoulder. My wife then made a longitudinal vibration (a push!) – the other particles (kids) passed that on to their neighbour. This allowed them to talk about compression and rarefaction, which are complex concepts for Year 7s, but because they could physically see it, it made sense to them.
——
I’m doing a sound investigation tomorrow with Year 4 – investigating the sound-proofing properties of various materials. Hopefully we will measure the sound with data loggers, if I can work out how to make them work! Otherwise I’ll use an app on my phone. The aim of the lesson is as much about the children planning an investigation (including making it a fair test) as it is about identifying the best sound-proofing.
——–
Year 8 group. Investigating circuits. Kids were given a circuit-building kit. Shown how to use meters to measure voltage and current. Then given a number of series and parallel circuits to build. The idea being they were asked to observe the brightness of the bulbs, measure the voltage across the bulbs, the current through them etc. Then tasked with explaining their observations, using the electricity model we had developed previously. This took a sequence of four lessons, all of which were practical. You can’t demonstrate rapid and sustained progress, but you can actually teach and get your kids to understand stuff on a useful level. I’m getting out of teaching as soon as I can because this sort of teaching is regularly judged as inadequate.
——
Investigating the effect of different drinks on our teeth with Year 4. Placed eggshells into cups filled with a variety of liquids (fresh orange juice, orange squash, water, milk, coke, blackcurrant squash). Predicted coke would do the most damage, but it was fresh orange juice. Also learned not to leave a cup of milk lying around for two weeks.
———————
Friday’s practicals – Access for Science did displacement reactions to investigate the reactivity of metals; AS Chem found the enthalpy change of two reactions then used Hess’s law to work out the enthalpy change of a reaction they can’t do practically; A2 investigated organic functional groups, doing a series of qualitative tests using chemicals including the infamous DNP; then Access for HSC got all excited burning magnesium as a lead into ionic bonding.
————————-
Tomorrow year 8 will be doing food tests. Thursday year 12 did investigations on polar/non-polar properties of some liquids. Tuesday year 9 burnt magnesium, iron wool and copper to investigate the formation of oxides. Enough? There’s lots more!
———
Tomorrow, my lab will be turned into ‘outer space’ and year 7 will be navigating around and exploring how gravity affects weight on different planets
——————–
With Year 4 we did an investigation to prove that gases have weight. With my science club we investigated the best quantities of liquids needed to make the largest bubbles.
————————–
Measuring how size/shape/weight of paper helicopters affects the time taken to reach the ground.
————————
Mixed 7-8-9 class doing food tests – looking at what food groups are in which food.
===============
Primary: an investigation into forces – children had to design a return capsule for an egg that was launched on a water-bottle rocket.
———————-
We investigated how to make objects more streamlined using a variety of plasticine shapes, a stopwatch and a tall tube of wallpaper paste. Messy!
I can see that these activities might be enjoyable and so worth doing in order to develop relationships between students and teachers but – in my view – they are less likely to result in concepts being learned than if a teacher were to explain the principles, ask questions to check understanding and then direct practice.
So, is there any point in doing scientific experiments in class (other than the social ones)?
If one of the purposes of education is to ‘produce scientists’ and/or scientific thinking, how will that happen without learning how to do experiments?
Or do you think that experiments in general are not worth doing – even for scientists?
Of course I think experiments are worth doing – especially for scientists. But remember, when scientists do experiments they’re testing original hypotheses. This doesn’t happen in schools. Children should do experiments because they’re an interesting and unique aspect of science lessons, but we should acknowledge that they are likely to learn less than they would if given a demonstration and a good explanation. I tend to the belief that getting students to ‘think like scientists’ in the hope that they will then actually become like scientists is misguided. All the work on expert performance over the last few decades suggests that experts and novices think in qualitatively different ways; experts are able to think like experts because they know so much about their area of expertise. Scientists may be inspired by their school experience of experiments but what makes them scientists is their vast body of domain-specific knowledge. Sadly, we can’t skip this vital step. As Emerson put it, “Every artist was first an amateur.”
What about this, for example?
http://science.sciencemag.org/content/332/6037/1516.full
One science teacher tells me he does experiments like this:
“Generally begins with a discussion of principles, followed by procedure and ending in investigation to allow the students to apply the principles in their analysis and discussion.”
That sounds fairly sensible.
Interesting comment here:
“There has been a controversy about investigatory work for some time. This stems from Nuffield in the 1960s, where students were doing investigations – but what happened when they didn’t get the ‘correct’ scientific answers? The problem is: can you have a genuine investigation if the answers are already known by someone in authority who’ll put you right and tell you exactly how Nature works? One solution is not to bother with investigations but to use practical work as illuminative of well-known scientific principles such as elasticity, the role of the teacher being to ensure there is constant reference between the world as shown and the world of theory. However, in general the world is messy and there is some attempt to do investigations where the solutions are genuinely open and often raise even more questions. That requires confident teaching, i.e. not feeling insecure that you don’t know the ‘right’ answer. So a long-winded answer is a mixture of all three, but the second, open-ended investigations often require student focusing but enough autonomy for them to decide the questions are genuinely open and real problems for them.”
Doing experiments can also help children to appreciate that ‘science is not just another opinion’, or to develop skills, e.g. being able to measure voltage. Generally the objectives are conceptual, epistemic, or skills-related (or a mix), and it doesn’t do justice to science education to focus solely on conceptual development.