I blame the free market for exam board cheats

For secondary school teachers like me, training sessions run by the exam boards are invaluable, and I’ve attended plenty of meetings where there have been strong hints about upcoming questions, similar to those exposed by the Telegraph this week. I’ve never heard an examiner be quite so open about the topics an exam would cover, though. I blame the free market for this behaviour: exam boards are anxious to keep their customers satisfied, and perhaps they think it only fair to give a little extra. After all, those attending are paying good money – often hundreds of pounds – for a short talk.

In the majority of cases, though, your school stumps up the cash for you to go because, even if you don’t get a copy of the forthcoming exam paper, these meetings give you a vital insight into how to improve your pupils’ grades – something your career, your pay packet and your school’s future all depend upon. The main function of these seminars is simply to translate exam board jargon, often so bewildering in official documents, into friendly English.

The scales have fallen from my eyes in a few of these meetings. A few years back, for example, I was teaching an A-level English language course and my students had been getting consistently poor results in one unit; I’d read and re-read the bumf – the mark schemes, the examiners’ reports – but it hadn’t helped. Then I attended a meeting and was told a few key things, chief among them the importance of fostering genuinely personalised responses in my pupils; the examiner explained how to do this. I changed the way I taught and was rewarded with considerably better results.

This anecdote is significant in the light of the furore caused by the Telegraph’s undercover reporting. On the whole, exam boards are not telling teachers exam questions so that pupils can be spoon-fed the answers, but quite the opposite. They are reassuring teachers that the questions are predictable, in order to try to persuade them that students will get the best marks if they come up with their own ideas rather than producing copy-cat generic answers to what are essentially generic questions.

This “cloning” is happening because all GCSEs and A-levels are now marked by measuring the degree to which pupils meet the relevant assessment objectives (AOs); these are essentially the key subject skills. Exam questions are shaped by the AOs, which means they are rarely surprising. For example, in my own subjects, English and media, I feel confident about the types of questions that will be asked in the exams, even if I can’t be sure of the exact wording.

An examiner could therefore point teachers in the right direction – something the Telegraph perhaps misinterpreted as telling teachers the questions – without feeling that he or she was saying anything new; most teachers would have guessed what questions were going to come up anyway.

Exam boards have put a renewed focus upon “personal response” (somewhat ironic, considering their questions are anything but personal). This is a noble aim, but I’m not sure the current regime of exams is delivering the originality of thought that we all want. The Assessment Reform Group has conducted exhaustive academic research into this area in schools. Its report, Fit for purpose?, seriously questions the validity, reliability and cost-effectiveness of our national assessment system, finding it to have a negative impact upon the quality of classroom teaching, pupil motivation and “genuine” standards. While I don’t agree with everything the group says, I think its powerful, evidence-based arguments have never been taken seriously enough by the powers that be. Let’s hope this latest debacle leads to a more serious debate about the role of exams in our schools.

