This article is more than 3 years old

It’s time to think about scrapping A levels entirely

The Ofqual algorithm exposed weaknesses that were already in the system. Debbie McVitty makes the case for scrapping national exams.

Debbie is Editor of Wonkhe

Over the last few weeks we’ve all seen heartbreaking individual stories of young people who were set to achieve their ambitions to progress to university, overcoming barriers such as poverty, or disability, only to fall foul of the SQA or Ofqual algorithm. Small numbers overall, to be sure, but looming large in the public eye.

Though the government asked universities at the last minute to be “flexible” in making admissions decisions, this request followed months of finger-wagging over unconditional offers, the re-introduction of student number controls, and an apparent attack on the practice of using contextual information in making admissions offers.

Institutions like Worcester College, Oxford, which have decided to ignore awarded grades and honour the offers they have already made, are to be applauded – and it would be great if the rest of the Oxbridge colleges did the same.

But Oxbridge is an anomaly in the admissions system, as most colleges make only a handful of offers over the numbers they expect to admit, because in most cases applicants make their grades.

For the majority of universities, ending up with the right numbers means offering more places than you have available, on the assumption that some students will go elsewhere, not make their grades, and so on. For some courses numbers will be fixed anyway because of constraints on placements, equipment or space, so the requested flexibility is not as easy as it sounds – especially with only a few days’ notice.

We’ll see over the next few weeks how this immediate mess might be resolved. Certainly, universities should be making it a priority to admit people where it’s feasible to do so, and especially where there’s indication that their awarded grades are out of step with past performance. The government should let it be known that number controls are No Longer A Thing, or at least that anyone pointing to the A level debacle as a reason for exceeding planned numbers should be let off the hook.

But assuming that this is an anomalous year, and there are no specific lessons to be learned about the politics of awarding grades, might there be some takeaways to apply as university admissions come back into the policy spotlight in the coming year?

One exam to rule them all

One thing that’s been made clear is that the system’s dependence on a single set of exams makes it weaker and more precarious. The whole argument over post-qualification admissions hinges on whether it is possible to conduct university admissions in a time-bound window over the summer – there’s been talk of moving A levels earlier, starting the university year later, and so on. But all this assumes that it’s physically impossible to make a decision about a candidate without having their A level grades.

Had England – like Wales – retained AS levels, universities would have had reasonably recent, validated information on which to base offers for candidates who had missed their grades. AS levels aren’t perfect – whether or not you agree with Michael Gove’s critique, as education secretary, that they offered too much opportunity for resits to bump up grades – but they’re certainly better than nothing.

But you can take it even further. One of the reasons that predicted grades are so frequently askew is that there is so much riding on them. It’s a teacher’s effort to predict performance in a single set of exams, rather than an assessment of a pupil’s general level of competence or preparedness for university-level study.

The debates about respecting teachers’ judgements miss the mark in that respect – if it’s about predicting performance, it’s reasonable to give your student the benefit of the doubt. You’re judging what you hope they’d be capable of on a good day – and for many students, the days they sit their A levels are not guaranteed to be good days.

There’s also, of course, the issue of potential bias in the other direction, where factors like race, disability and socio-economic status could unduly influence teachers’ assessment of likely future performance.

And let’s not forget that the reason the algorithm disproportionately affected disadvantaged young people is the pre-existing, enormously unjust, gap in attainment between socio-economic groups. The algorithm only laid bare what is already known: young people at independent schools are genuinely more likely to get As and A*s. It’s only this year, when it’s an algorithm rather than an exam that has produced the injustice, that it’s playing out as a national scandal.

Scrap A levels

If we scrapped national exams and schools moved to more of a regular lower-stakes assessment and GPA-like system, with regulation of the standards of schools’ awards rather than the national qualifications, universities would have plenty of reasonably robust information on which to base offers.

Whisper it: perhaps many, even all, of those offers could legitimately be unconditional. Being a good student – even being among the “best” – needn’t come down to your performance during six weeks in May and June. It could be the mature judgement, backed by evidence, of teachers and admissions tutors.

Schools and universities could work together – as they do now – to create learning opportunities for young people aiming to study particular subjects, but unlike now, these could be credit-bearing, allowing those young people the opportunity to demonstrate their academic readiness to their chosen universities.

There’s nothing particularly special about A level performance – exams suit some young people more than others; some will be having a good day, others not, and so on. And there are many downsides: stress and pressure on the pupils, teaching to the test to the exclusion of other opportunities for personal and academic development, and the inevitable annual scramble to match people to places in August.

There would still be a need for something like Clearing – especially for applicants whose performance had improved towards the end of the school year and those who hadn’t secured an offer, whose circumstances had changed, or who had come late to the process. But there would be much less angst about not having got into the university you’d been planning for and dreaming about for months.

Far too much about the university admission system is based on the idea that performance in a national exam is the gold standard which cannot be challenged. A whole host of young people enter university with BTECs, having quietly completed a number of units with continuous assessment. Maybe it’s time the question “what did you get in your A levels?” was consigned to the dustbin of history.

10 responses to “It’s time to think about scrapping A levels entirely”

  1. It all sounds great in theory, but …
    1) to my mind the net effect of those ongoing low stakes assessment would feel to kids as if they were always doing coursework. And coursework is stressy. For any perfectionist young person it would feel like they could never make an error ever – not good for MH.
    2) Not only would it be perpetual coursework for kids, but also for teachers. They’d feel they needed to be always on the case ensuring no child underachieved
    3) Plus coursework type assessments tend to really advantage those with supportive home backgrounds, or with the cash to get extra tuition, or with older friends /siblings who can “help”
    4) coursework type assessments tend to have marks clustering at the high end which means a very small gap between grades and so minor differences are exaggerated
    5) Whilst final exams are certainly an imperfect tool, the exercise of actually learning the material thoroughly for them promotes a better understanding – properly done revision is about consolidating not cramming
    6) If you are assessing summatively almost all the time you are more likely to reduce the time for interesting extraneous material, not increase it.

    My preferred option is a return of a modular system – at least to the extent of A-level building on AS. That would also help restore the breadth we lost due to Gove’s linearisation.

  2. > The algorithm only laid bare what is already known

    That’s not the case: the algorithm introduced entirely new injustices. That’s why, for instance, Rye St Antony, a poorly-performing independent school with small classes, went from 18% A/A* grades in 2019 (below the national average) to 48% A/A* grades in 2020 (better than every state school in the country). That’s not a problem that was already there; it’s an injustice introduced by Ofqual.

    > It’s only this year when it’s an algorithm rather than an exam that’s produced the injustice

    The exam exposes inequality of various sorts (that’s the entire point of measuring things), but it doesn’t produce it.

    In fact, the situation is exactly the opposite of the way you describe it: national exams are a great leveller, allowing the poorest child in the country to compete against the wealthiest and win. The scandal of Ofqual’s algorithm is that that opportunity has been taken away.

  3. I thought the whole idea was that past performance was another means of standardisation. If a school has been over-optimistic to ‘chance their arm’, why didn’t the past-performance standardisation bring it back into alignment with previous years? How does that change get through?
    I get that schools and teachers are on the league table treadmill so will chance their arm, but the algorithm should have spotted that as an anomaly – class size small or large, state or independent, all not relevant. Couldn’t the computer system have had a quality / data check where grades being proposed for a school were then checked against past performance? To catch any school making the most of an opportunity.

  4. Measuring a student’s ability to grasp higher education is a difficult one.

    I bombed in my A-levels due to a number of self-inflicted reasons (Girls, terrible exam technique, crap module grades) and only scraped into an HND on the back of a D and 2x U’s. I had to take a gap year because my UCAS form had been mis-filed by the school.

    But I worked during my gap year for the Year in Industry program and found I excelled at my HND due to a lack of exams and a more adult relationship with my lecturers. I completed a degree and PGCE and went on to become a successful teacher.

    I found University to be easier than A-levels. I found I could recall the knowledge my a-level teachers taught when the time came to write up experiments and lab reports but floundered when asked to do the same in an exam. It helped I could refer back to a textbook if I wasn’t sure about something.

    Based on my own experiences I would say scrapping A-level exams is a great idea. In the real world if we don’t know something we aren’t expected to recall that fact or formula straight away. We can refer back to textbooks and the internet and apply that knowledge to practical situations which helps us remember it for future reference.

  5. Helen, the point about escaping past-performance standardisation is that it did not apply if the cohorts were small. And that is absolutely right statistically.

  6. Carl, actually yes we are in some cases. We can’t look things up the whole time.
    Just to give you a really obvious example – if as a teacher I had to look something up every single time a child asked a question, they’d not have much respect for me. And I don’t see most emergency surgery being carried out with the surgeon checking the procedure on the internet throughout, either.

    Of course exams aren’t always the best form of assessment in every circumstance. But that doesn’t mean they have no place.

  7. One of the things the Editor gets wrong is the blaming of “the algorithm” which is no more and no less than a particular set of rules for classifying a result. The problem ultimately was the fact that some test centres were too small for moderation of teacher assessments to be possible. The rules could not be applied so the unmoderated assessments had to be used, creating winners ultimately balanced by losers.

    The logic of this piece is essentially that rules are imperfect and that, therefore, there should be no rules. People that tear up the rule book are to be praised. In short, anarchy should rule.

    The Editor’s piece is also full of U-turns. Number caps, grade inflation, unconditional offers – you can find many WONKHE articles calling for the former and criticising the latter. Now everything is turned upside down: the cap is bad and grade inflation / UOs are good. Inconsistency – the handmaiden of anarchy.

  8. There is just one thing I would like to point out, although I disagree with other things too. Exams are meant to test your ability to store and apply a large amount of information all at once, in a way that coursework can’t. With coursework, there’s the issue that you can learn all about one topic for a certain assessment, then you’ll have forgotten about it a few weeks later. I’ve done it. It’s much easier than the continual, sustained focus most people require to prepare for final exams.

    When you’re giving a speech, you can’t read a sentence at a time, then go backstage to memorise the next sentence. At a music recital, you can’t spend time learning new pieces halfway through your programme. A lot of tasks in life require you to be in command of a range of skills and information at the same time.

    Yes, putting all of the stakes on a few days with exams is risky, but so are many endeavours in life. I think that the ability to overcome stress and struggles in this way is an unappreciated part of what exams can help teach. And this is coming from someone with very sensitive bowels, who often can’t sleep before an exam. Obviously, I don’t like exams. No one really does. But that doesn’t mean they should be scrapped.

    And if it does go wrong, that’s what the resits are for.

  9. Continuous assessment is not the only alternative to centrally-administered exams. I know the Swiss system quite well. There are no national exam boards: schools set their own assessments, which are then validated by a system of mutual moderation between schools. My friend over there, a very experienced (indeed now retired) teacher, regularly visits other schools to inspect and validate their annual exam round, including, I think, viva voce exams with at least some students. He is accredited to do this. It is rigorous but locally controlled, by people who know their courses and know their students. And there is none of the tyranny of teaching to the test either, since the teachers control what will be assessed! As usual, we need to think wider in this country – what often seem to be “givens” are nothing of the sort.

  10. The American way of school assessment is far better than the UK’s one-exam-means-all approach. Besides that, I like the US credit system too: students are given more responsibility and freedom in choosing subjects.
