
OfS goes on the attack over “unmerited” grades

The Office for Students thinks that many of the grades awarded to graduates are "unmerited". Sunday Blake has serious concerns of her own

Sunday Blake was an associate editor at Wonkhe

For those who are not familiar: each year the Office for Students (OfS) uses an algorithm that looks at a provider’s historic performance and changes in its cohort demographics, and predicts the percentage of students who should graduate with a first or upper second class degree at each provider.

The algorithm predicts outcomes at an individual level, taking into account factors such as a student’s race, gender, socio-economic background, and whether they have a disability. The aggregate of these predictions is then compared with what actually happened.

If more students receive these grades than the model anticipated, the excess is deemed “unexplained” – or, this year, “unmerited”. Contentious.
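To make that concrete, here is a minimal sketch of what a predicted-versus-observed comparison of this kind might look like. The data, features, model and coefficients below are my own illustrative assumptions – OfS uses its own statistical modelling of real sector data, not this code.

```python
# Illustrative sketch only: a toy "predicted versus observed" benchmarking exercise.
# The real OfS analysis is its own statistical model of sector data; the features,
# synthetic data, and coefficients here are invented purely for demonstration.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def toy_cohort(n, year, provider):
    """Generate a synthetic cohort with the kinds of attributes the article mentions."""
    df = pd.DataFrame({
        "provider": provider,
        "year": year,
        "female": rng.integers(0, 2, n),
        "disabled": rng.integers(0, 2, n),
        "low_ses": rng.integers(0, 2, n),          # crude socio-economic proxy
        "minority_ethnic": rng.integers(0, 2, n),
    })
    # Invented attainment process: a baseline chance of a first/2:1, shifted by
    # attributes, plus a slow year-on-year improvement the demographic model can't see.
    logit = 0.7 - 0.3 * df["low_ses"] - 0.2 * df["disabled"] + 0.02 * (year - 2010)
    df["first_or_21"] = rng.random(n) < 1 / (1 + np.exp(-logit))
    return df

# "Historic" cohorts calibrate the expectation; a later cohort is then assessed.
baseline = pd.concat([toy_cohort(2000, y, "Provider A") for y in (2010, 2011)])
current = toy_cohort(2000, 2021, "Provider A")

features = ["female", "disabled", "low_ses", "minority_ethnic"]
model = LogisticRegression().fit(baseline[features], baseline["first_or_21"])

# Individual-level predictions are aggregated into a provider-level expectation...
expected_rate = model.predict_proba(current[features])[:, 1].mean()
observed_rate = current["first_or_21"].mean()

# ...and the gap between observed and expected is what gets labelled "unexplained".
print(f"expected first/2:1 rate: {expected_rate:.1%}")
print(f"observed first/2:1 rate: {observed_rate:.1%}")
print(f"'unexplained' gap:       {observed_rate - expected_rate:+.1%}")
```

In this toy version the model only “knows” about student characteristics, so anything else that lifts outcomes – better teaching, better support, better assessment design – lands in the residual gap. That residual is the figure being relabelled from “unexplained” to “unmerited”.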

Inflation or improvement?

Between 2010 and 2020, OfS found that the percentage of full time undergraduates attaining a first class degree doubled. It has now released an updated analysis of attainment statistics covering the academic year 2020-21, and has found that the proportion of students attaining a first or upper second class degree is up from 67 per cent in 2010-11 to 84.4 per cent in 2020-21. The two key findings:

  • Three-quarters of providers have had significant increases in the awarding of first and upper second class degrees.
  • Almost all providers saw a significant increase in the awarding of first class degrees.

A sparkling endorsement of improved teaching standards, or something to be “tackled”? OfS asserts the latter. The regulator sees the increase as “unexplained” and therefore as jeopardising the “credibility, value, and reliability of the qualifications”, which can lead to a loss of public confidence in higher education – and it has vowed to take a strengthened regulatory approach to tackle grade inflation.


Variable factors

The statistical modelling approach that OfS uses was “adapted to account for the sector-level changes in awarding patterns” this year, due to the impact of Covid-19 lockdown disruptions. This is all well and good, but there are multifactorial reasons for improvements in attainment, particularly in the last year.

The pandemic did not just change institutional awarding behaviour but also provided lessons in accessibility. Universities have risen to the challenge of ratifying – or in some cases, introducing – support for students with disabilities and caring responsibilities. In 2020-21 hundreds of institutions also responded to conversations about gendered violence and safety on campus and in the surrounding cities, spurred by the tragic murder of Sarah Everard and by further concerns about drink and needle spiking. There was further anti-racism advocacy and action after the murder of George Floyd, and widening conversations and progress towards decolonisation.

To me, these are “observable factors” which may impact attainment. Is it so inconceivable that this activity would improve the learning environments in which disadvantaged students work, and so alter the outcomes?

At a sector level, we expect students to learn and grow as they progress through their degree, we expect educators to upskill, and we expect institutions to develop a better understanding of learning and adapt their provision accordingly. Since 2010 we have seen enhanced understanding of neurodiversity and increased use of digital learning resources and learning analytics, all of which go some way to helping students reach their potential. And a sector which continuously strives to improve – and was very well funded by OfS’s predecessor to do so over the previous 30 years – is surely a sector that should hold the confidence of the public.

Natural progression

The thing is, I do get it. Even if teaching techniques, the learning environment, and higher education culture improve to the point that every student is able to attain a first class degree, we will still need ways to decipher which students have truly excelled.

After all, at one point in time, pre-1870, we divided the country into those who could read and those who could not, which is a rather silly metric now that we have universal education. So a solution is needed. Perhaps we need to date-stamp degree classifications for context. Perhaps we need to include percentile breakdowns. Perhaps we need an entirely new classification system that reflects the strong progression we have made as a sector.
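To make the date-stamp and percentile ideas slightly more concrete, here is a minimal sketch of what such an award record could contain. The field names, classification boundaries, and percentile calculation are illustrative assumptions of mine, not a proposal from OfS or anyone else.

```python
# Illustrative sketch only: one way a date-stamped, percentile-annotated award
# record could look. Field names and classification boundaries are assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class AwardRecord:
    classification: str       # e.g. "First"
    awarded_on: date          # the "date stamp" giving the grade historical context
    cohort_percentile: float  # where the student sat within their graduating cohort

def classify(mark: float) -> str:
    """Assumed classification boundaries, for illustration only."""
    if mark >= 70:
        return "First"
    if mark >= 60:
        return "Upper second"
    if mark >= 50:
        return "Lower second"
    if mark >= 40:
        return "Third"
    return "Fail"

def award(mark: float, cohort_marks: list[float], awarded_on: date) -> AwardRecord:
    """Attach a cohort percentile and an award date to the traditional classification."""
    below = sum(m < mark for m in cohort_marks)
    percentile = 100 * below / len(cohort_marks)
    return AwardRecord(classify(mark), awarded_on, round(percentile, 1))

# Two students with the same classification, distinguished by percentile and date.
cohort = [48, 55, 62, 64, 68, 71, 73, 78, 81, 90]
print(award(71, cohort, date(2021, 7, 15)))  # a First at the 50th percentile
print(award(90, cohort, date(2021, 7, 15)))  # a First at the 90th percentile
```

Two students with identical classifications would then still be distinguishable by when the award was made and where they sat within their cohort.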

Because in principle, I agree with the OfS B4 condition, which came into effect on 1 May 2022. I agree that “the same level of student achievement should not be rewarded with higher degree classifications over time” – but only if we are absolutely clear that this is what is happening. Because I also don’t think that disadvantaged students should be bound to benchmarks set a decade ago, in different socio-political circumstances and in a sector with less understanding of teaching, learning, and accessibility.

But, regardless of what I think, OfS has, once again, pledged to investigate “unexplained attainments” – that is, attainment that deviates from what the OfS algorithm says students should achieve. Condition B4 will allow the OfS to use its regulatory powers to look into the reliability of a provider’s assessment practices and the credibility of its academic regulations, and to intervene if necessary. So we all have that to look forward to.

13 responses to “OfS goes on the attack over ‘unmerited’ grades”

  1. Having recently retired from a 45-year career in university teaching, I have seen an enormous improvement in how students are taught. Course content has been made more explicit. Students are given better information about what is expected of them. Study materials are available online. There is more recognition of different learning styles. Most of these improvements have taken effect within the past 15 years, and these probably account for most of the variance in academic attainment.

  2. It’s a genuinely difficult one to square with the push to close attainment gaps.

    On a more mundane level, there have also been significant improvements in the consistency of marking, with a move away from the idea that some people just don’t give firsts. That, coupled with the drive to use the full range of the mark scale (which historically didn’t happen much outside of STEM), has pushed numbers up.

    But, ultimately, and whatever the reason for changes, the DfE and therefore the OfS are not going to stop until grades come back down (presumably to the arbitrary ‘correct’ levels of 2010-2011). So that pressure will remain, and it makes good red meat to feed the culture wars in the press.

    To be honest, not having a first class band three times the size of other bands would probably have the biggest impact, but I’m conscious that requires wholesale cultural marking change, and everyone amending their student record system, regulations and policies, and everyone’s a bit tired and busy right now.

  3. The growth in the use of artificial ‘contextual’ adjustments and unrecorded verbal instructions to differentially mark identifiably different students to achieve ‘equity’ is a major driver of this ‘grade inflation’, or more general dumbing down. As most University businesses are only trying to ensure future numbers and income, it’s unsurprising, given the PR value of the aforesaid ‘equity’, that this is happening. Having sat through graduations where other graduating students on the same course have commented that a person who never attended lectures, was late, and never submitted required work still got not just a First but a ‘special’ award as well because of who/what they were, I think the OfS has very good reason to look closely at ALL Universities, and at the actual teaching qualifications (good luck with that) of those delivering the courses.

  4. I agree with the previous posters that teaching and consistency of marking have both improved over the 34 years I’ve taught in HE. This makes achievement much more accessible to all students. The use of ‘unmerited’ is unmerited here. Students and teachers are doing better. If this is a problem, then let’s talk about standards a bit more, and not resort to the promotion of norm-referencing grades.

    One way to reboot this discussion would be to change the grading system to something more logical, with evenly sized grade bands. Let’s just start again. There are many reasons why our current system is not fit for purpose. For one, using the full range of marks has much more impact in the first class band, with 30 marks available, than it does in the pass, 2.2 and 2.1 bands, and so there is a disproportionate impact. I know I’m not suggesting anything new, as I’m old enough to remember Burgess, R. (2007). Beyond the honours degree classification: The Burgess Group final report. Universities UK. http://www.hear.ac.uk/reports/Burgess2007

  5. One of the contributing factors is the decline in the quality of External Examiners, some of whom, I am sure, just sign the paperwork and never look at the scripts, assuming the internal moderation process is robust.

    1. The EE system has always been hit and miss. There was a suggestion to provide a national training scheme not long ago. Much needed! The increase in first class degrees is a result of a combination of things: better learning and teaching (a good part of the increase); NSS and league tables (better grades = happy students); recruitment demands (also related to the last point); and the expansion of degree apprenticeship/work-based degrees (too easy to get high scores). I missed off changes to degree awarding algorithms because I think they may have had a minimal impact. The influence of the size of the bands is relevant only because the mantra ‘use the full range of marks’ became over-used. Perhaps we should analyse outcomes by thinking in terms of firsts and seconds – which is where we began many years ago. Whatever the case, the classification system is part of our heritage: if we abandon it we lose part of what makes us what we are (so we won’t).

  6. In order to be consistent, surely the OfS needs to investigate why admissions into universities have been skewed so heavily in favour of privately educated applicants. The evidence is strongly in favour of this being an advantage borne of socio-economic circumstances, rather than inherent ability or better pre-university education. So perhaps we should correct downwards the grades of students from Eton and Winchester…? Otherwise, this is a rather selective recognition of grade inflation…

  7. I am still trying to unpick the methodology OfS uses to predict degree outcomes. Does anyone with a better grasp of stats than I have understand whether it would mean a university with a large number of minority ethnic students from lower socio-economic groups would inevitably be identified as having high unexplained grade inflation if increases in firsts and 2:1s were observed over the years? Does the methodology presume that white students will do better in their degrees? And if this is correct, why is the OfS ‘baking in’ this disadvantage in their analysis?

  8. I think this is pretty spot on, demonstrating some of the potentially many moving parts that can affect individual and cohort level degree attainment over the years. While the OfS have made some effort to consider what factors could explain the rise in degree attainment, this only covers the factors that the OfS have thought of or are practically able to measure and control for – the idea that they can claim the remaining difference is evidence of “unmerited” grade inflation is at best premature. To me, it seems that the way to determine how much of the change is “merited” would be to remark assessments from 10 years ago, but I am not aware whether this type of study has already been done or is planned/ongoing anywhere.

    1. A good article, and I think you hit on something in your comment too. As I understand it, “unexplained” is used in its statistical meaning in the OfS report. They consider only student attributes, and acknowledge that the explanations may be found in factors that they don’t consider in their statistical analysis – improved teaching and all the rest. The move from “unexplained” to “unmerited” is then not a conclusion that can be sustained, no matter how true it is that awards should be merited.

      More fundamental explanations, I think, are to be found in changes brought about from about 2010: increased fees, competition, and student consumerism.
