It is with a certain level of angst, amusement and apprehension that I await Ofsted’s new Inspection Framework. The angst and amusement are combined; our Curriculum Policy, originating in the last decade, initially contained four main sections: the Curriculum Model, the Planned Curriculum, the Delivered Curriculum and the Received Curriculum (read intent, implementation and impact). It was pretty much dismissed by an inspection team at the time but I was delighted to see the terms resurfacing.
The benefit of our inspection system is that after a day or two the inspection team leaves you alone for another few years (unless they decide you require improvement or are inadequate); you can then reflect upon which bits of the report are wisdom and which bits are waffle. As a gnarled ex-Curriculum Deputy I wasn’t prepared to give up easily on the curriculum, an area of education that had always fascinated me. We decided to carry on with our curriculum journey alongside the other challenges of leading and teaching in Blackpool.
Our particular journey wasn’t really helped by the qualifications system, which was largely modular and unintentionally fostered a “remember it for the test, then forget it” approach in many teachers’ and pupils’ minds. Likewise, the Law of Unintended Consequences may soon haunt the new Ofsted Inspection Framework, if it is implemented in the format suggested by various leaks and statements. Take for example the stated “… fact, with much of that (school’s) internal attainment and progress data, they and we can’t be confident that it’s valid and reliable information.” For decades the quality of your internal data has helped determine whether you were outstanding or inadequate, with significant consequences either way. It’s now being proposed that a school’s internal data will not form any part of the new inspection process.
Leaving aside the confusion above over the terms reliability and validity, and in the hope I don’t get into too much trouble for this, a true story: in one inspection I gave the inspection team our predicted grades, based on teacher assessments, for a particular key stage. Late on in the process they gave them back to me, suggesting that they were a little low and I might want to consider increasing them. As an obedient servant of the inspection machine I duly did so, and the team were very grateful for my efforts. Whilst you might be outraged by this, I genuinely don’t know which of the two sets of data was the more accurate and reliable: the ones the teachers made up or the ones I did. My sense was that the team were now collecting the data needed to validate what they had already decided. Ofsted are right to question the reliability of schools’ internal data, but they need to question their proposed replacements much more.
With a school’s internal data off the table what might we see take its place? This is from Ofsted’s third curriculum research paper, “Lesson observation was not an in-depth or central part of the triangulation process – the work scrutiny and pupil discussions carried more weight. This was beneficial in ensuring that inspectors looked at the quality of learning over time and not individual lessons. This is not intended to devalue the purpose of observation and we expect it to carry considerable importance for helping to assess the quality of education in the new framework.”
What will be fascinating to see is the research behind the reliability of work scrutiny and pupil discussions, as well as lesson observations, especially when carried out over a very short period of time, often by non-subject specialists. If they are not reliable, we will have another invalid set of conclusions and grades being assigned from dubious data. There is also a book scrutiny workload monster waiting to emerge from this, either to replace the data workload monster or to join it. Interestingly, Ofsted’s research so far for this new framework hasn’t looked at inter-rater reliability with respect to any judgement, including, critically, the new quality of education one.
Ofsted has thankfully engaged some really high quality people. These experts will know that things can’t be boiled down to a single overall grade; it is more complex than that and we really can’t be that certain. It’s time to stop the farce of grading schools. My constructive suggestion is that we stop trying to evaluate schools, because it is simply too complex and the conclusions too uncertain; we should focus solely on improving them.