Today I got to rub shoulders with the great and the good at Bethnal Green Academy (second most improved school in the land, dontcha know?) for the Teach First sponsored launch of Ben Goldacre's thoughts on Building Evidence into Education.
I somehow found myself on a guest list that included Michael Gove, Kevan Collins, chief executive of the EEF, and sundry academics and educational bigwigs. Fortunately there were also a few familiar faces: I was joined by fellow rent-a-gob Tom Bennett who is an old hand at these sorts of affairs and handled himself with considerable savoir faire and aplomb, as well as the ever elegant and debonair David Weston, chief executive of the Teacher Development Trust.
Gove opened proceedings by announcing that the best policy is formulated and developed by practitioners, and added that the views of politicians are a poor second to the practical experience of teachers. I was slightly startled, as this belief has remained cunningly concealed for the entirety of his tenure to date. I look forward to being consulted and having my views respected in the near future.
He quickly handed over to the impish and chaotically be-coiffed Ben Goldacre who acknowledged the difficulty of teachers being told what to do by outsiders. Evidence in education is actually about “empowering teachers, and setting a profession free from governments, ministers and civil servants”. It would be “bizarre”, Goldacre told us, for the Department of Health to tell doctors which treatments to use and it should be seen as equally odd that the DfE routinely instructs teachers on how to teach. (Mr Gove nodded vigorously at this so presumably he intends to stop doing it.) He set out his stall by stating that teachers should undertake randomised trials as a regular part of their professional practice and that the results of these trials should be widely disseminated to teachers. This would, he suggested, lead to teachers being more thoughtful, critical consumers of educational research and would enable them to generate new ideas for future research.
But what of the accusation that running randomised trials is unethical? If you believe a particular course of action is the best one, is it fair to deny it to some students? Medicine and social science are littered with examples of treatments or initiatives which practitioners were convinced were right, only to find, after reluctantly engaging in randomised trials, that they were actually causing more harm than good. The point is this: how will we ever know whether our pet pedagogical theory actually has the impact we think it has unless we submit it to fair testing? It’s all well and good to cry that what works is what works, but how do we know? Yes, your exam results might be good, but might they be even better if you stopped whatever it was you so passionately believed in?
Perhaps a more meaningful criticism of epidemiology in education is of Ben’s belief that medicine and education are essentially comparable. They’re not. Although patients and students, doctors and teachers might share some superficial similarities, there are many more differences. You can reproduce the effects of a drug in controlled conditions and therefore be fairly certain it’s having an effect. You can’t do the same with a pedagogical intervention: teacher quality, student motivation, time of day, a fly in the room, someone farting can all cause wildly unpredictable, unreproducible results. Not only that, we have the Hawthorne Effect: a point raised by both Tom Bennett and a student in the audience. If we conduct trials on students we will affect their behaviour just because they know we’re conducting the trial. This being the case, what can RCTs really tell us? And if they tell us something that defies common sense, what then?
Ben told us that RCTs ought to be made straightforward to run and increasingly commonplace; it should be the norm for teachers to be conducting fair tests on new ideas. The problem is that, currently, there’s nowhere for geeky teachers to go to register their willingness to take part in such trials. What’s needed is perhaps a network which connects teachers together so that they can participate in large studies with a view to designing their own methodologically robust research questions. Now, where on earth could we find such a network? Whilst it might require slightly more than just a teachergeek hashtag, it wouldn’t require much more. All it would need would be for universities and research institutions to commit to it and we’d be away.
Ben ended with a call to arms, stating that teaching was poised on a “precipice” and that teachers needed to claim their professional independence. Cue more acquiescent nodding from Mr Gove.
Now, this is all fine and dandy, but I have a few issues with Ben’s proposals. Firstly, although there is widespread acceptance of the view that the best way to improve schools is to improve the quality of teachers, there is also a well-worn and very public discourse that teachers are not knowledgeable enough to be trustworthy. And the problem with that is that it’s true. I’m somewhat of a rarity in that I spend so much time and effort reading about education research and reflecting so publicly on my practice. Yeah, of course loads of other people read edu-books and blog (many of them much better than I do) but we’re in a tiny minority. I’m constantly shocked by how little many teachers know about teaching.
But perhaps I shouldn’t be surprised. We’ve become used to enacting top down policy and being rewarded for compliance. How many heads would be happy for their staff to run randomised trials on their school’s behaviour policy? What would happen if something went wrong? And, more crucially, what would happen if you found it was causing more harm than good? Would this finding be welcomed? Currently, being seen as ‘challenging’ is not a good thing. We know that, unless we want our cards marked, we’re supposed to keep our heads down and do what we’re told.
Kevan Collins suggested that we need more professional autonomy and that teachers and school leaders need to act like professionals if they want to be treated like professionals. I agree. But this isn’t going to happen by itself. One audience member made the point that policy makers should run randomised trials on new policy areas before rolling them out across the whole country. This seemed to make perfect sense and to be the kind of clear lead an Education Secretary should espouse. I was dumbfounded to hear the apparently apolitical Goldacre say in response that we can hardly expect policy makers to run randomised trials unless we, as teachers, embed the culture in our profession from the ground up.
What Ben fails to understand is the lamentable state of much of the guff that gets touted about in the name of CPD. There is no quality control. Still, in 2013, there are teachers being trained in Brain Gym, learning styles, multiple intelligences and all sorts of other ineffective atrocities. If we really want a future where teachers claim their professional status and commit to being critical and reflective (and I do) then, unfortunately, we need some top down policies imposed to make it happen.
For all the perceived faults with the NPQH, getting rid of the requirement for heads to pass some kind of qualification is a most retrograde step. All school leaders, especially those responsible in any way for the training and professional development of other teachers, must be required to complete some sort of professional qualification in education theory and research methods. What goes on in ITT is haphazard at best and then, for the most part, teachers are left to their own devices and abandoned to the tender mercies of ignorant school leaders. Sure, they’re well intentioned, but we all know what the road to hell is paved with!
I’d had high hopes that I’d leave invigorated and clear on how I could set about restructuring my own practice with deep roots in evidence and research. I ended up none the wiser. At the close we were told that everyone obviously agreed with Ben’s ideas, and given absolutely no way forward. We all clapped politely and filtered out in dribs and drabs. The consensus I gleaned from conversations with fellow delegates was that it all sounds lovely but utterly impractical.
So, there it is: a warm, fuzzy, pie-in-the-sky idea which, without clear leadership, will be mere sound and fury, signifying ab-sol-utely nothing! I very much hope all Michael Gove’s nodding translates into meaningful action. But I don’t expect all that much. Obviously, I will continue developing my own professional practice and will attempt to run my own small scale RCTs (I have an idea for a short term trial looking at teaching strategies in the lead up to Year 10 mock exams after Easter), but will anyone else join in?
Judgement: requires improvement
Here’s a pdf of Ben’s paper: see what you think. Am I being harsh?
And here’s an alternative view on the same event: Evidenced Based Practice: why number-crunching tells only part of the story by @drbeckyallen
I wasn’t there but did follow it on twitter and read the report, which was surprisingly brief with no real recommendations to it. I agree with just about everything you say but just to add that I am concerned about the overall approach of ‘treating’ children and patients (indeed they could be one and the same at times). I thought we were moving away from a paternalistic approach to education and medicine where the practitioner always knows best, towards something more responsive to the input of wider stakeholders. What about the views of students, parents/carers and employers for example?
Nick, we did have some student voice! A Year 12 student at Bethnal Green Academy asked if it wasn’t a bit crap experimenting on students. Always nice when tokenism exposes the lack of imperial trousers.
Randomized trials work in medicine because everyone agrees on a set of fundamental goals – ‘Is the patient still alive after x years?’ – which we can measure and which are quite tough to game.
Not convinced the same is possible in education.
As a PhD student in education I also followed the day on twitter with interest, thanks for this blog which sums up a lot of my feelings/hunches on the event. Thought it laughable that a report on a move towards evidence-based practice was devoid of references.
I’m researching arts education & am fascinated by different research & evaluation methodologies & definitely want to move away from some of the fuzzy, small scale ethnographic reflexive stuff that has defined my field. But there’s so much about RCTs that is problematic, and the drive to make ed research=med research, even for someone like me just starting their research career, seems seriously done…
Also, although I’m 100% behind teachers being drivers of research, it seemed from the twitter feed that the mention of university-level ed research wasn’t really addressed. Surely working via unis & joining the wider research community (as partners, not participants) is the way forward?
“Yeah, of course loads of other people read edu-books and blog (most of them much better than I do) but we’re in a tiny minority. I’m constantly shocked about how little many teachers know about teaching.”
THIS. And often they don’t actually care, either. One other person amongst the maths and science staff had heard of Hirsch.
More worryingly, I see a significant group of teachers who actively don’t want to know what the literature says. They like the way they teach now. This applies equally to the crusty 55 year old who wants to stick to his age-old resources as to the bushy tailed NQT who objects to silence because “it bores me, y’know”. It takes balls to face the fact that empirical evidence might say learning is improved by a method that makes the teacher’s job harder or less enjoyable.
My friend runs an NHS clinical research unit, and it’s a hugely complex operation. I’m not totally convinced we can apply the same approach to education. For a start, she has to recruit her volunteers. Do we just assume the consent of ours?
The other issue of course is where the child starts from when the research begins. How can you measure the complexity of the child’s total life experiences?
Of course research is vital, and teachers are finding new ways to share ideas and experiences, see pedagoo.org for one great example. But teaching is surely as much craft and art as science?
I dislike the labels as much as Ben does, but more because they offer a ‘magic bullet’ when it’s all about complexity really. Bits and pieces of everything often seems to work.
I read a lot of research papers, but quality control can be quite hard. I find it very frustrating to find so many papers describing the problem and coming up with no effective solutions. It feels sometimes like high effort for low reward. Any suggested sites for access to quality and useful research?
Reasons for optimism:
1) (and only) It’s not that difficult. Here’s something I did http://improvingteaching.weebly.com/hinge-questions.html in a very reasonable amount of time; I would tweak the measurement and the control groupings to make the study replicable (since I was only seeking to satisfy myself first time around).
What’s needed – a rethink of how CPD time is used – instead of suggestions like recertification, or lecturing us on the latest school policy changes, why not ask every teacher to do something like this every term? One more prerequisite – leadership – encouragement to look into things that have evidence behind them – and a willingness to face the uncomfortable truths this might bring up – as you suggest.
I see how this might be a jump in many schools – I spent three and a half years refusing to follow school policy on grades on work – evidence be damned. But I don’t see it as a massive jump for willing headteachers to make – your comment on the NPQH is well-made…
The point about professionalism (and trust in teachers) is important too – equally I think this is something teachers can best remedy through their own actions – it requires more of us to think, research, write, share and encourage others to do the same.
This is probably an over-simplistic theory of change – but my optimism rests.
What strikes me as odd is the contradiction between Gove’s nodding along, and his deconstruction of professional training. Trials in medicine are not carried out by people who happen to have a roughly appropriate degree and a willingness to “have a go”. They are coordinated by people who specialise in particular fields of expertise, trained for their profession, and part of a network of professionals working in a sector which still places considerable focus on the proper education and training of its new intake.
You can nod as much as you like, but if the next generation of “researchers” doesn’t even need to bother with any degree-level study of education, what chance is there of us embedding research-based practice across the board?
Love this until you say… ‘What goes on in ITT is haphazard at best ‘
What is your evidence for this? (You can guess which sector I work in from the question!). Here you sound more like Gove than Goldacre who, though I disagree with his RCT idea, at least tries to provide evidence for his assertions. It may have been a throwaway remark but these are dangerous times…
Ali – my evidence is having worked with trainees & ITT providers for the past 12 years. Yes, this is anecdotal but it’s enough for me to feel my comment is justified. Do you want me to provide details?
Michael, you’re totally right. I worry about what’s happening to teacher training – without time to learn the theory, the research & become critical & reflexive, won’t any future school-based RCT just be a case of teachers working as (unpaid!!) gov researchers providing evidence for the next policy change? What Gove/Goldacre are talking about needs training, critical & reflexive researchers & as much as I love & respect teachers I think some of the other comments on where we’re at on this front are right. For this to work, you need teachers who are trained researchers & have an ongoing, reciprocal relationship with researchers (who are equally trained in current teaching methods!!)
This is my dream, I suppose. My uni department is now in the process of splitting into two small ‘centres’ – one concerned with academic teaching & research, the other with teacher training. In my experience this (or something similar) is happening in many places as unis are being forced to wind down PGCE training… How are we ever going to develop communities of research practice like this?
Also Steve, much research is in peer-reviewed journals, which you have to pay a stupidly large subscription to access. If RCT & the like become the norm, schools would presumably need to start paying for these subscriptions, or how would you have access to initial research to be able to design your study? Another reason to get teachers into partnerships with unis (who already have this access) or, of course, make academic journals free to read, but that’s a whole other debate…
Really interesting. I agree with your caution here..and I’m a huge Ben Goldacre fan. There are many ways to find out what works in a particular context..eg a three person focus group can tell you a lot about how their thinking developed..and you learn from that in a nuanced context specific way. Scale that up to a randomised trial and all you get is a bland averaging effect that may suggest a broad general tendency …but nothing more nuanced. This is the danger of effect-size cultism (!) A strategy may have a Hattie effect size of 0.73 but every teacher who tries it will do it differently and the range would be huge..even if the average was similar. (Dylan Wiliam: a person with a foot in boiling water and a foot in freezing water is not comfortable on average).
Even in physics, where there are simple verifiable laws..complex systems are unpredictable..eg drop a marble exactly the same way…it goes to a different place each time. Drop a bag of marbles…it is highly chaotic within some general overall pattern. That is learning….
Tom
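Tom’s averaging point can be made concrete with a quick simulation (all numbers invented for illustration): two sets of per-teacher effect sizes with the same mean of 0.73, one tightly clustered and one wildly variable, look identical “on average” even though the second set includes teachers whose students were actively harmed.

```python
import random
import statistics

rng = random.Random(42)

# Hypothetical per-teacher effect sizes for the same strategy: both groups
# share the same average (0.73), but very different spreads.
consistent = [rng.gauss(0.73, 0.05) for _ in range(100)]
chaotic = [rng.gauss(0.73, 0.80) for _ in range(100)]

for name, effects in [("consistent", consistent), ("chaotic", chaotic)]:
    print(
        f"{name}: mean = {statistics.mean(effects):.2f}, "
        f"range = {min(effects):.2f} to {max(effects):.2f}, "
        f"teachers with d < 0: {sum(e < 0 for e in effects)}"
    )
```

The printed means are effectively the same, which is all a headline effect size reports; the range and the count of negative outcomes are where Wiliam’s foot-in-boiling-water problem hides.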
At our academy we have over 40 members of staff doing a practitioners masters degree. Tough for staff but based in their teaching and raising standards. We are now getting teachers from other schools in our alliance as well.
Chris – what does the practitioners masters involve?
[…] it directionally right, but was sceptical of its practicality. David Didau said many thought it utterly impractical. Dr Becky Alan welcomed it, but urged us to ask not just what works, but what works under which […]
I think the debate should be around how we formalise the process of evidence-based innovation in education. I don’t know one teacher who doesn’t innovate within the classroom but it mostly stays in their classroom and rarely encounters reflective scrutiny. Think how many classroom innovations have been dropped because of a lack of the understanding of educational theory needed to develop them to a point where they have a significant impact.
Too often we as professionals look at something (usually through tired and desperate eyes looking for a solution to an impending or current problem within our own practice) and say ‘I can see that this works’ rather than ‘why does this work?’ or ‘how could it work better?’.
It’s encouraging / building time for that practice, which will enhance education. It should be tied into appraisal systems as well.
Chris – it certainly should be part of appraisal, but if we value teacher research we should allow time & training also.
An interesting post, with even more interesting comments. In particular, regrettably, those that don’t paint teachers themselves in the best light. I think the most apt observation is that which has us as a profession trained in conforming to demands which flow downwards. And that’s the challenge; how to get every single professional to a stage where they have earned the autonomy which so many of us wish for. To paraphrase The Smiths, we just haven’t earned it yet, baby.
Iorek: ain’t that the truth. There’s a good motto in there – What would Morrissey do?
Great post. Wheeling Ben Goldacre out is another example of the top-down quest for the silver bullet, while doing nothing about the culture of fear and compliance. I blogged about this myself yesterday: http://engagedlearning.co.uk/vegemite-and-transformative-teaching/
Would appreciate any comments as I’m conducting an RCT on the value of ed blogs!
Have enjoyed catching up with your blog. Keep up the fight!
[…] 14th March: Building evidence into education. David Didau, The Learning Spy. https://learningspy.co.uk/2013/03/14/building-evidence-into-education/ […]
I really like reading some of the blogs about education research etc etc. I just wonder where you guys get all the time? As a head of English in a large comprehensive, I am still bogged down with marking, planning etc etc most evenings.
Juli, I do it by not having friends*.
*or only friends as edu-nerdy as me
You may have noticed that I haven’t really blogged for the past couple of months. That’s term 5 for you!
Has anyone actually noticed that most of the EEF trials do not seem to have reached a level of significance that would be considered a finding in medicine?
To be clear, if you do an RCT you are supposed to state your null hypothesis in advance, and if you find a non-significant result (for example, p above 0.05) you report that you failed to reject it. The null hypothesis is the statement that there is no improvement or change. Yet EEF claim so many months’ learning for their ‘trials’ when in fact most of the results (nearly all) are not significant. Take a look. Evidence based quackery if you ask me.
Amelia Jones
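Amelia’s point is easy to demonstrate for yourself. A minimal sketch, with entirely invented mock-exam marks for two classes of 15: even an apparent gain of a couple of marks per student can easily fail to reach significance in groups this small, which is exactly the situation in which claiming “months of progress” would be premature. The permutation test below is one simple way to get a p-value without any distributional assumptions.

```python
import random
import statistics


def permutation_test(control, treatment, n_perm=10_000, seed=0):
    """Two-sided permutation test for a difference in group means.

    Under the null hypothesis the intervention makes no difference, so the
    group labels are exchangeable: we shuffle them repeatedly and count how
    often a mean difference at least as large as the observed one arises
    by chance alone.
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(treatment) - statistics.mean(control))
    pooled = control + treatment
    n = len(control)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[n:]) - statistics.mean(pooled[:n]))
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one smoothing avoids p = 0


# Invented marks: the 'treatment' class scores a few marks higher on average.
control = [52, 61, 48, 70, 55, 63, 58, 49, 66, 54, 60, 57, 51, 62, 59]
treatment = [55, 64, 50, 71, 57, 66, 61, 52, 68, 56, 63, 60, 54, 65, 62]

p = permutation_test(control, treatment)
print(f"p = {p:.3f}")
```

With these numbers the p-value comes out well above 0.05, so we fail to reject the null hypothesis despite the visible improvement in raw marks, which is the distinction Amelia says the EEF summaries blur.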