In my last post I discussed the natural volatility of GCSE results and the predictably random nature of results over the long term. I ended by saying, “The agenda for school improvement has to move away from endlessly poring over data looking for patterns that don’t exist. We need to find new – better – ways to hold schools to account and come up with new definitions of what school improvement means.”

Interestingly, two readers got in touch to cite the example of Michaela School as a potential outlier. Obviously, Michaela’s first cohort are still a number of years away from sitting their GCSEs, and it won’t be until 2018/19 that we get to see their first set of results. We can, however, look at a number of other schools that have performed especially well with students from poorer backgrounds.

For the past two years Schools Week have been publishing an alternative league table. Here are their Top 10 schools from 2015 and 2016:

2015 results – source: Schools Week http://schoolsweek.co.uk/turning-the-league-tables-onto-disadvantage/

2016 results – source: Schools Week http://schoolsweek.co.uk/gcse-results-see-disadvantaged-schools-making-the-grade/

The big story has been the remarkable results achieved by students at King Solomon Academy in Westminster. This is a school with a very high proportion of students in receipt of Free School Meals – a commonly used proxy for socio-economic disadvantage. Amazingly, KSA’s results have gone up (93% to 95%) even as the proportion of FSM students has risen (67% to 75%).

Now of course, this might just be luck. Maybe they’ve fluked these astonishingly positive results for two years, in which case they are bound, eventually, to regress to the mean. Despite the compelling appeal of these figures, two years doesn’t make a reliable pattern.
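To get a feel for what regression to the mean would look like here, consider a minimal simulation sketch. All the numbers in it (3,000 schools, cohorts of 150, a 55% true pass rate) are illustrative assumptions, not real data; the point is that every simulated school has identical underlying quality:

```python
import numpy as np

rng = np.random.default_rng(42)

n_schools = 3000   # hypothetical number of schools
cohort = 150       # hypothetical pupils per cohort
true_rate = 0.55   # every school has the SAME underlying quality

# Two years of headline results; all variation is pure chance
year1 = rng.binomial(cohort, true_rate, n_schools) / cohort
year2 = rng.binomial(cohort, true_rate, n_schools) / cohort

# Pick the ten 'star' schools from the year-1 league table...
top10 = np.argsort(year1)[-10:]

print(f"Top 10 schools, year 1: {year1[top10].mean():.1%}")
print(f"Same schools, year 2:   {year2[top10].mean():.1%}")
# ...the year-1 stars look exceptional, but in year 2 they land
# back around the 55% everyone shares: regression to the mean
```

Two strong years in a row, as KSA has had, makes a fluke less likely than one, but it doesn’t rule one out.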

But consider instead La Retraite Roman Catholic Girls School in Lambeth. Its results are not quite as startling as KSA’s, but its track record stretches back much further. According to the Schools Week website, here is La Retraite’s headline data for the last two years:

[Table: La Retraite’s headline results for the last two years – source: Schools Week]

Pretty good, right? But what can we find out from looking further back? Thanks to the wonderful people at Education Datalab, I was able to find this:

[Chart: La Retraite’s results plotted against the national average over roughly the last decade – source: Education Datalab]

There are a few things to bear in mind about this. First, from 2009/10 iGCSE results were included in these figures; then, in 2013/14, following the Wolf Review, the list of eligible qualifications was cut back (a maximum of two non-GCSEs per pupil, no single qualification counting as more than one GCSE, and so on).

We can see that La Retraite – a school with high numbers of poorer students – has always out-performed the national average, but there have been a few blips. I can imagine the hand-wringing in 2007/08 as results fell by 8%, and then the soaring relief as they rocketed up by 11% the following year. Likewise, 2012/13 must have seen a few champagne corks popping, and 2013/14 must have felt like a punch in the teeth despite results being 15% above the national average. The thing is, all this variation was almost certainly random and tells us next to nothing about changes in the quality of education the school provided. But the trend is ever upward – and well above the norm – even in these more straitened recent years.
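As a rough sanity check on how big purely random blips can be, here is a sketch of one imaginary school whose quality never changes at all. The cohort size and pass rate are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

cohort = 120      # hypothetical year-group size
true_rate = 0.70  # the school's underlying quality is fixed

# Ten years of headline figures generated by binomial noise alone
results = rng.binomial(cohort, true_rate, 10) / cohort

for year, (rate, prev) in enumerate(zip(results[1:], results), start=2):
    print(f"Year {year}: {rate:.0%} ({(rate - prev) * 100:+.0f} points)")
# Year-on-year swings of several percentage points appear
# even though nothing about the school has changed
```

Swings on the scale of La Retraite’s ‘blips’ sit comfortably within what chance alone can produce for a single year group.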

So, to what can we attribute La Retraite’s success? It’s tempting to conclude it has something to do with it being a girls’ school or a Catholic school. Tempting, because the rest of us can then say, ‘Well, it’s alright for them.’ But is there anything non-faith, co-educational schools can learn from them?

I’ve never been to La Retraite, but I do have some anecdotal evidence to go on as my wife worked there about 15 years ago. She said it was an amazing place with very high standards of behaviour and equally soaring expectations of what its students could achieve. These are factors that seem worth trying to replicate everywhere. It also had a high degree of support from parents and comparatively few English students, factors which can’t be so easily replicated. I’ve no idea whether the school is the same today, but results were excellent then and they’re equally good now, so I’m assuming so.

Clearly, you can buck trends, you can do things differently to everyone else and, if you do them well, you can outperform similar schools by a significant margin. But, as Tom Sherrington has made clear, in a zero-sum game we can’t all do this. In any given year, half of all schools will be above average, but the other half will be below. We can move heaven and earth to shunt the bell curve to the right, but we can’t change statistical laws. Maybe KSA (and possibly Michaela) provide new templates for school improvement, but if we all do what they do, results will still distribute along a curve. Whether this is good news or bad news depends entirely on whether schools are held to account sensibly by people who thoroughly understand the data.
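To make the zero-sum point concrete, here is one last sketch, again with invented numbers: even if every school improved by the same healthy margin, the share sitting below the average would not move an inch:

```python
import numpy as np

rng = np.random.default_rng(7)

# A hypothetical bell curve of school pass rates
scores = rng.normal(60, 10, 3000).clip(0, 100)

# Suppose every school adopts the same winning formula and
# gains ten points: the whole curve shifts to the right...
improved = (scores + 10).clip(0, 100)

def below_average(s):
    return (s < s.mean()).mean()

print(f"Below average before: {below_average(scores):.0%}")
print(f"Below average after:  {below_average(improved):.0%}")
# ...but roughly half of all schools still sit below the average
```

Absolute standards can rise for everyone; relative positions cannot.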