This year’s A levels will be based on a teacher assessment of the standard at which a student is performing.
According to the joint Ofqual and DfE consultation, students should have the right of appeal – and there are measures to ensure private candidates (those who study for A levels outside of a school or college setting) can get grades too. A parallel consultation proposes that for vocational and technical qualifications (VTQs), those who would have received a calculated grade last summer would instead sit “adapted assessments” – punting this problem to awarding bodies via an alternative regulatory arrangement. In this piece I’m primarily focused on the A level consultation.
Trouble is, having cancelled exams, DfE seems perversely determined to have exams.
Finding meaning in grading
Grades must reflect what a student knows, understands and can do, and they must be widely understood and respected.
Though this quote makes sense on a surface level, this is actually a fairly radical policy departure for DfE. Apart from the Centre Assessed Grades awarded last year, every year since 2010 has seen students receive grades that situate them within predetermined grade boundaries in comparison with their cohort. A student who would have gained an A in 2016 may only get a B in 2017 – the same knowledge, understanding, and ability is recognised differently in different years. If they were good at exams.
This time round, teachers are not asked to predict the exam grades students would have achieved (as they were last year) had the pandemic not occurred. It’s all about the level at which a student is actually performing.
The big news for candidates is that there will be a final assessment towards the end of the academic year (round about the usual exam time), but this will be based on a set of papers (or other non-exam assessments where these are already in the specifications) developed by awarding bodies and administered by teachers. The bulk of the consultation considers how these papers are developed and what training and support is provided to teachers. But it is important to note that these papers don’t mean everything – teachers are to
Draw on a range of broader evidence of a student’s work in making their final assessment.
So this consultation is also a retreat from nearly a decade of insistence that only final exams can fairly assess students.
There’s also a very rough timeline:
- Assessed by teachers in May and early June
- Grades to exam boards in mid-June
- External QA (more on this to come) in June
- Results issued in July (not August as is usually the case)
- Appeals follow immediately, first to schools and colleges.
How should teachers decide what level a student is working at? My first thought went to the kind of level descriptors that were once produced by exam boards – so for English literature a student who displayed confidence and skill in choosing and using appropriate quotations from a secondary text in responding to a question on the interpretation of the play-within-a-play in A Midsummer Night’s Dream was probably working at least at an A standard at that point. In the 90s this would have worked – and indeed, revision guides would contain this kind of information. These days, not so much.
Years of norm- rather than criterion-referenced assessment have eroded this fine-grained ability, to the extent that many teachers used performance on last year’s A level paper as a basis for CAG grades rather than referring to grade descriptors. There are, to be clear, guidelines like this towards the back of qualification specifications (here’s the one for AQA English literature A – the levels start at page 33), but these link to numbers of marks for non-exam assessments rather than the grades someone is working at.
There’s loads in the consultation about when teachers should assess the level at which students are performing, but the assumption is that standardisation is to be done via a system of common assessments rather than common standards. These papers look likely to be similar in style and format to “normal” exam papers – and would cover a “reasonable proportion” of course content. The consultation leaves options open for these sets of papers to be compulsory or optional – and suggests teachers (and thus students) should get early warning of what topics will be covered. The thinking seems to be that teachers would set and mark sections of papers based on what has been covered in class – there’s some suggestion that each student should be assessed on a similar proportion of content.
Non-exam assessments do have these level descriptors attached as a matter of course – and it is suggested that these will not be moderated this year. Other performance evidence needs to be documented if it is used to derive grades – and for teacher-designed assessments a mark scheme would be needed.
You may turn your laptops over now
Perhaps buoyed by its stellar run of predictive accuracy over the past 12 months, DfE reckons that these exam-like papers will be taken in school in something approximating a traditional exam setting – “in line with public health guidance in place at the time”. The hedge is that if the pandemic makes it essential papers could be completed in an alternative venue, including a student’s home. The student, and anyone supervising them, would have to make a declaration that no “unauthorised assistance” has been received.
What’s missing here? Well – everything. There’s no account taken of the need to design assessments that make “unauthorised assistance” less of a factor, there’s no consideration given to the fact that a kid on a pimped-out gaming rig in their own bedroom in a quiet house would make a better fist of an assessment than someone perched at a kitchen worktop on a school laptop tethered to a spotty 3G router in a family of six.
Having cancelled exams, DfE seems determined to keep things as exam-like as possible. Not only does this defeat the point of cancelling exams, it loses the good points of formal exams – a level playing field, a country-wide embargo on papers – while retaining the bad points that make exam taking a particularly middle-class success story.
DfE should be planning for remote assessment as a default. There are ways of doing this via intrusive monitoring (see the many US scandals) but there are also ways of doing this via clever assessment design. Universities did this last year and are doing it again right now – maybe someone in DfE should have picked the phone up or read the (excellent) QAA guidance.
Do you agree or disagree that teachers and schools should be provided with support and information from exam boards? You might think I’m asking a rhetorical question, but it is right there in black and white as question 31. It is the only question that addresses support and information directly, though there is a chunk about internal quality assurance that fills in a few gaps.
A key part of internal quality assurance within schools should be a declaration by the head that exam board requirements have been met. Schools are expected to agree their approach to assessment within a set of choices provided by the exam board. There will be a quality assurance of approaches taken by schools by the exam boards, but it doesn’t appear that any moderation at exam board level would take place by default – just some sampling of evidence at subject level. Exam boards would not be permitted to change submitted grades unless evidence has been reviewed and following discussion with the school. The mutant algorithm has thus been slain.
On appeals, these start with the student appealing to the school – the school or student may then get to appeal to the exam board, on the standard “this hasn’t been marked correctly” lines. There’s a nod to grade variability for non-exam assessments: the default is that if the original mark given can be supported, it should stand. A lot of appeals are clearly expected, which is the reason results day could be in July.
Finally, for private candidates, the preferred option seems to be that students should take the papers set by exam boards, and should work with schools for other requirements. Questions 55 and 56 are nearly identical – suggesting “normal” exams in the summer or autumn would make the whole issue go away.
The HE angle
Higher education providers are specifically named as people who should respond, and you have until 29 January at 11.45pm. It’s difficult to know what view you should take.
They say you should never watch laws or sausages being made; after what we’ve learned this year I would add A level grades and university admission decisions. On one level, universities have to use whatever grades come out of the end of the machine – most unconditional offers are still forbidden should you want to base your admissions on anything other than this process. With this mindset, the consideration needs to be whether these arrangements will produce grades that do not unfairly disadvantage specific groups of students – and whether we have any assurance that students will arrive on campus in 2021 with the skills and aptitudes we would usually hope for.
On another level, universities, as practitioners of a very different form of education and assessment than the compulsory sector, will have a lot to say about fairness and the purpose of grades. It would be easy to do this to excess (I’m going to take a punt and say Gavin Williamson has not read any Freire and that he never will – I suspect he may not even have read Newman) but there is maybe scope, with a review of admissions going on too, to prod gently in the direction of assessment that benefits the student, explicitly favours aptitude and skills over knowledge and recall, and is based on something other than warmed-over memories of those spaced-out desks in the gym.