(*This is a version of this week’s editorial)
I hate school performance measures.
I’ve hated them for a long time. It’s not that they do a bad job; it’s that half the time when I run a story involving figures and schools, lots of readers get annoyed.
It goes back to league tables, or performance tables, as they became. They used to come on floppy discs (ah, those long-gone days), which you could only get after promising the Government to surrender at least one limb if you revealed their secrets ahead of time.
The discs contained tables and field separators and codings, and details of every school in the country, so after spending two days working out what was what, extracting the relevant data for local schools and double checking it, I’d end up with a rather small table of results. I’d sort the table in various ways, mainly to feel better about expending so much effort for so little result. If I could rank each of five columns, that meant five different tables and it didn’t seem such a mind-numbing waste of time.
As soon as the tables appeared, the schools in the bottom half would be on the phone pointing out the system’s various failings, such as the way two special needs pupils in a class of 30 would drag the whole class’s results down.
There was always the school we missed out; with so many numbers and symbols, one always got forgotten and then I had to invent a better reason than boredom and incompetence for the omission. I stopped bothering with tables after a very nice retired head called up and spent some time explaining the failings of the system. Now performance tables go to the Press Association, to which we do not subscribe, so we don’t get them (so much for localism).
Then there are Ofsted inspections. Teachers hate being inspected. I suspect it’s because Ofsted inspectors are aliens. This can be the only explanation for the really weird thing they’re able to do: when a report on a school is good, the teachers love their Ofsted inspectors and can’t wait to tell us all about it. Conversely, when a report is bad, Ofsted inspectors are fools and charlatans, the report is out of date, and was only ever a meaningless snapshot of the school in the first place.
(Just to be petty: It’s all right teachers complaining about snap inspections but we get inspected every week. However hard we try, it’s impossible to eliminate mistakes, and someone will always write in and tell us. Everyone makes mistakes at work, but we’re the only people who have to print a public acknowledgement of ours. I doubt we’d get away with just saying that any mistakes were not a reflection of our work that day . . .)
Anyway: only an alien life force could make anyone hold such contradictory views at the same time; human beings surely aren’t that erratic on their own. This attribute of Ofsted has several predictable effects. It means a school that does well is straight onto our newsdesk demanding a big report, because newspapers are always full of bad news and it’s our duty to show something in a positive light.
When they don’t do so well, it’s left to us to check Ofsted’s website and then run a report, in which case we get people complaining that newspapers are always full of bad news and we should report things in a more positive light.
Worst of all (and this is the only time any of this really matters), when a school is put into special measures, when it’s in everyone’s interest to rally round and find out why a school is letting down our greatest asset, no-one tells us anything. The school, parents and governors certainly don’t want the news getting out, but it seems odd that even Ofsted, set up to ensure standards are met, doesn’t bother telling people when they’re not.
Back in the days of the otherwise excellent Cheshire County Council, it actually lied to us when we raised concerns about a school. I called them up and asked point-blank if there was trouble at the school and was told no, despite the fact that they were in the process of sacking the head.
Six months later the school was onto our newsdesk wanting a story about its fantastic success in getting out of special measures . . .
All this is because we recently ran a story in the Biddulph edition about Key Stage 2 SATs results. One school had apparently performed better than another. The school that did well immediately spoke to our reporter, and we later had letters of praise for a fair and balanced story. The school that apparently did less well didn’t return our calls and then complained most strongly.
The article was “irresponsible and lacking in knowledge of the way schools are judged,” said the headmistress, though she admitted that “performance did dip this year”. “It is over-simplistic to use league tables as the sole measure of a school’s success,” she added, something we agree with completely. Which is why we hate school performance measures.
So here’s a solution, for all schools in our area, not just in Biddulph: form an Ofsted committee. When an Ofsted report, SATs results or any other figures appear, contact us. If it’s a bad Ofsted (and Ofsted’s satisfactory is now not satisfactory), or one school is lower in a “league table”, give us some explanation and quotes. We’ve got no agenda against one school or another; we just have a duty to report the results — more precisely, we have a duty to be fair, so we can’t report the good reports and ignore the bad. If one school gets a glowing Ofsted report, we treat it the same as a school getting a less glowing one.
After all, a glowing report is just as much a meaningless snapshot as a bad one.