In the past, school improvement was easy. You could push pupils into taking BTECs or Diplomas (sometimes with 100% coursework) equivalent to multiple GCSEs; you could organise your curriculum to allow for early entry and multiple resits; you could bend the rules on controlled assessment and a whole host of other little tricks and cons intended to flatter and deceive. Now what have we got? PiXL Club?
As Rob Coe laid bare in Improving Education: A Triumph of Hope over Experience, school improvement has been a tawdry illusion. Evidence from international comparisons, independent studies and national exams tells a conflicting and unsavoury tale.
As Coe said, “The question … is not whether there has been grade inflation, but how much.”
Scurrilously, he suggested various ways we can make our efforts at school improvement look like they’ve worked:
- Wait for a bad year or choose underperforming schools to start with. Most things self-correct or revert to expectations (you can claim the credit for this)
- Take on any initiative, and ask everyone who put effort into it whether they feel it worked. No-one wants to feel their effort was wasted.
- Define ‘improvement’ in terms of perceptions and ratings of teachers. DO NOT conduct any proper assessments – they may disappoint.
- Only study schools or teachers that recognise a problem and are prepared to take on an initiative. They’ll probably improve whatever you do.
- Conduct some kind of evaluation, but don’t let the design be too good – poor quality evaluations are much more likely to show positive results.
- If any improvement occurs in any aspect of performance, focus attention on that rather than on any areas or schools that have not improved or got worse (don’t mention them!).
- Put some effort into marketing and presentation of the school. Once you start to recruit better students, things will improve.
In other news, Amanda Spielman – ex-chair of Ofqual and incoming head of Ofsted – said at the 2014 researchED conference that school-level volatility is both well-known and predictably unpredictable:
Results can be expected to go up or down by between 9% and 19% every year! Jack Marwood explains this phenomenon in his contribution to What if everything you knew about education was wrong?
Here is Seaside Primary School in North Yorkshire*, a fairly typical two-form entry school. These are the percentages of children achieving level 4 or above in reading, writing and maths:
2013: 77%
2012: 70%
2011: 58%
2010: 69%
2009: 77%
2008: 76%
There is no pattern. Unless a school consistently records 100%, there never is a pattern for any school, in any historical data. This is because the data is based on children’s results, and children are complicated and individual, and the school population in any given school is statistically too small to make meaningful generalisations.
…Long-term trends in something as complex as educational outcomes are – unless you mess with the data by, you know, making the tests easier, selecting by ability or dis-applying certain children from assessment, or simply not reporting stuff – always random.
*Name changed
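Marwood's point about small cohorts is easy to check with a back-of-the-envelope simulation. Nothing below comes from his data: the 60-pupil cohort and the fixed 70% "true" chance per pupil are invented for illustration. The school in this sketch never changes at all, yet its headline figure still bounces around from year to year:

```python
import random

random.seed(0)

# A two-form-entry primary school has roughly 60 pupils per year group.
# Assume every cohort has an identical "true" 70% chance per pupil of
# reaching the expected level -- i.e. the school itself never changes.
COHORT_SIZE = 60
TRUE_RATE = 0.70

def simulated_pass_rate():
    """Headline percentage for one randomly drawn cohort."""
    passes = sum(1 for _ in range(COHORT_SIZE) if random.random() < TRUE_RATE)
    return round(100 * passes / COHORT_SIZE)

# Six "years" of results from a school whose underlying quality is constant
results = [simulated_pass_rate() for _ in range(6)]
print(results)
print(max(results) - min(results), "percentage-point spread")
```

Run it a few times with different seeds and you get exactly the sort of patternless wobble in the Seaside Primary table: the sample is simply too small for the headline figure to sit still.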
All this is more pressing now than it ever was. Quite rightly the DfE have moved to reduce the opportunities for gaming the system and Ofsted are getting better and better at not just looking at single measures of success. GCSE grades will be applied on a curve with the top 4.75% always getting a grade 9 and the bottom 4.75% always getting a grade 1. Roughly 40% of students will get grade 4 or below and there is nothing schools can do about it.*
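This is what norm-referenced grading means in practice: boundaries sit at fixed percentiles of the national distribution, so the share of each grade is fixed no matter how well everyone does. A toy sketch (the raw marks are invented; only the 4.75% figure comes from the post) shows that even if every candidate in the country gained ten marks, exactly the same proportion would get a grade 9:

```python
import random

random.seed(2)

# Toy norm-referenced grading: the grade 9 boundary is set so that the
# top 4.75% of the national raw-score distribution clears it.
def grade_9_cutoff(scores):
    ranked = sorted(scores)
    return ranked[int(len(ranked) * (1 - 0.0475))]

# Invented national cohort of raw marks
cohort = [random.gauss(50, 15) for _ in range(100000)]
# Suppose teaching improves and EVERY candidate gains 10 marks
better_cohort = [s + 10 for s in cohort]

frac_9 = sum(s >= grade_9_cutoff(cohort) for s in cohort) / len(cohort)
frac_9_better = sum(s >= grade_9_cutoff(better_cohort) for s in better_cohort) / len(better_cohort)
print(frac_9, frac_9_better)
```

The boundary just moves up ten marks with the cohort, so `frac_9` and `frac_9_better` come out identical: under a fixed curve, relative position is all that counts.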
Stupidly though, the government is still insisting that schools need to be above average to avoid being labelled as failing. Schools will tear themselves apart looking for the latest silver bullets but there are none. If a school does especially well in one year – or even two – results will inevitably regress to the mean. No amount of grit or growth mindset can resist this mathematical bulldozer.
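The "mathematical bulldozer" is simple to demonstrate. In this sketch (all numbers invented: a stable "true" score of 65 with random year-to-year noise), we look only at the runs where a school happened to have an unusually good year, then see what happens the year after:

```python
import random

random.seed(1)

TRUE_SCORE = 65.0  # the school's stable underlying performance (invented)

def year_result():
    """One year's headline figure: true score plus random noise."""
    return TRUE_SCORE + random.gauss(0, 8)

good_years, followups = [], []
for _ in range(10000):
    y1 = year_result()
    if y1 > 75:                      # an "especially good" year
        good_years.append(y1)
        followups.append(year_result())  # the very next year

avg_good = sum(good_years) / len(good_years)
avg_next = sum(followups) / len(followups)
print(round(avg_good, 1), round(avg_next, 1))
```

The schools selected for a good year average well above 75, but their follow-up results average right back around the true score of 65. No loss of grit, no failure of leadership: just selection plus noise.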
As Tom Sherrington says in this post: “rapid improvement is only likely to happen where a) the school was performing significantly poorly in the first place b) the cohort has changed significantly or c) something dodgy is going on.”
Of course, all this may end up meaning that English schools start to do better in real terms. International comparisons and independent studies might tell us over the next few years that the system is improving, but this will be small comfort if foolish accountability measures punish schools for falling foul of statistical inevitabilities.
Broadly, I’m in favour of the changes to exams and Progress 8 is definitely a preferable model for school-level accountability to focusing everything on the C/D borderline in two subjects, but unless we want headteachers to spend all their time looking for ways to game perverse incentives, the DfE must change the way GCSE results are viewed and schools are treated. The agenda for school improvement has to move away from endlessly poring over data looking for patterns that don’t exist. We need to find new – better – ways to hold schools to account and come up with new definitions of what school improvement means.
At least – Thank Christ! – we’ve got someone taking the helm at Ofsted who understands the complexity of all this.
*Apparently this may change after the first year. It says here that, “we propose to carry forward the grade standard established in the first award in subsequent years. This will be done through largely the same approach as is in place for pre-reform GCSEs, which is based on a mixture of statistics and examiner judgement.” According to @Yorkshire_Steve this means that if KS2 results go up then the boundaries could change to allow more than 60% of students to get Grade 5 and above. There is, as far as I can tell, no detail beyond that.
Interesting. Maths aged 10, though – that is going up.
Rob Coe, I salute you! If only the truth would set us free. But, alas, England and/or the USA suffer from the same ‘damaged beyond repair’ problems, compounded by an unwillingness (inability?) to STOP the tinkering and attempts to fix the unfixable.
I so wish this was not true. But, after decades of diligent efforts to improve the Education System, I was forced to accept reality – leave it to fail completely, while designing, developing, and building a totally new system that avoids the current immutable factors.
[…] keep school leaders awake at night. It’s a theme picked up by David Didau, who argues in Where Now for School Improvement? that the “agenda for school improvement has to move away from endlessly poring over data looking […]
So which one is Michaela under 1) 2) or 3) or will they come out as 100% each year?
We’ll have to wait a few years, won’t we? Interesting that KSA’s results have held steady though. Third time lucky this year? Or maybe they’re onto something…
Obviously you are correct, in saying that there will always be year on year variations in results, and that the smaller the school, the greater such variations are likely to be.
But surely, the idea that “long-term trends in something as complex as educational outcomes are … always random” seems to imply that there are NO improvements that can ever be made in pedagogy or curriculum, and that pretty much any teaching method and curriculum will produce the same results in the end. Where is the evidence for that?
While it’s true that there will always be a bell curve, that doesn’t mean that we ‘know’ that the bell curve can never be shifted to the right. Michaela is a good example of a school that is trying evidence-based changes to pedagogy, curriculum, and behaviour management, in order to attempt that.
Of course you can move the curve. Of course you can. But there will still be random variation.
[…] my last post I discussed the natural volatility of GCSE results and the predictably random nature of results […]
“GCSE grades will be applied on a curve with the top 4.75% always getting a grade 9 and the bottom 4.75% always getting a grade 1. Roughly 40% of students will get grade 4 or below and there is nothing schools can do about it.”
I am really struggling to find the source evidence for this – it may just be that after so long trudging through the DfE consultation documents I am now blind to it. Could anyone point me to the source document?