
OfS’ new metric will help students make sense of career prospects

The new Proceed metric is an experimental tool to help separate marketing promises from the reality of achievement and progression. OfS chief executive Nicola Dandridge explains the rationale.

Nicola Dandridge is the Chief Executive of the Office for Students.

Students decide to go into higher education for many different reasons.

It might be for the pleasure of learning, or to meet people and enjoy a wider university experience – which we hope will soon feel much more normal – or to get a degree which then leads to a fulfilling career.

Whatever their reasons, and however they decide what course and what subject is right for them, we can all agree that students should be able to access good, independent advice and information while they weigh up their options. That is why the OfS has published a new measure which helps to project how likely students are both to complete their course and then go on to find professional employment.

The measure – Projected completion and employment from entrant data (Proceed) – will provide important information for prospective students. We tested it with universities and colleges, with careers advisers and with the OfS student panel, and it was felt that the measure would be useful to prospective students and their advisers.
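To make the arithmetic behind the projection concrete, here is a minimal sketch in Python. It assumes – as the name suggests – that Proceed combines a projected completion rate with a professional employment rate by simple multiplication; the function name and figures are illustrative, not the OfS's published methodology.

```python
# Minimal sketch only: assumes Proceed multiplies a projected completion
# rate by the share of graduates found in professional employment.
# The figures below are made up; they are not OfS data or methodology.

def proceed_estimate(projected_completion_rate: float,
                     professional_employment_rate: float) -> float:
    """Projected share of entrants expected to complete their course and
    then be in professional employment 15 months after graduating."""
    return projected_completion_rate * professional_employment_rate

# Example: 85% projected to complete, 70% of graduates in professional work.
print(f"{proceed_estimate(0.85, 0.70):.1%}")  # -> 59.5%
```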

Developed with the sector

I know that the measure is controversial with some. It is possible to argue that career outcomes should not be a measure of success. And it is also possible to dispute how the OfS has made the calculations.

I accept the first point, though it is worth noting that this is how many universities and colleges market their courses – and students should have as much impartial information as they can to assess the veracity of marketing claims. In terms of the calculations, we have adjusted the methodology since we published the data anonymously in December 2020. The changes we have made largely reflect the feedback we received from universities and colleges. It has been a useful and productive exercise, and we continue to seek feedback on what remain experimental statistics.

The changes we have made also help set the data in context, and recognise the challenges in deciding whether particular outcomes should be seen as positive. For example, we have added a sector adjusted average to help users compare the performance of similar types of providers. We have also changed the way in which we assess the outcomes of those graduates who are travelling, caring or retired – making these neutral rather than negative outcomes. And the measure no longer treats a student who transfers from one provider to another as having had a negative outcome at their first provider.

A welcome uplift, in context

The result of these changes is that, where they apply, most providers will see that the percentage of students expected to find professional employment has increased when compared with the December 2020 publication. More importantly, the changes mean the quality of information on offer to prospective students has improved.

We recognise that there remain limitations to the measure. We make it clear in the report where results should be treated with particular caution. We understand that the true benefits of a degree are realised through someone’s career, and not just 15 months after graduation. There are also regional differences in the proportions of highly skilled jobs, as well as differences between different groups of students. So it is important to see today’s data – at both a provider and at a subject level – as just part of the picture. But it is an important part, and one that we believe students will want to see.

The OfS does not currently use this indicator in our regulation of individual providers, and any future proposals to do so would be subject to further consultation. We are currently considering responses to our consultation on the regulation of quality and – subject to our analysis of these responses – will set out next steps soon. We are clear, though, that students – whatever and wherever they study – all deserve good quality teaching which prepares them for a fulfilling life after graduation, and the publication of this data helps to contribute to that outcome.

7 responses to “OfS’ new metric will help students make sense of career prospects”

  1. It’s really important to contextualise the PROCEED data. The college with by far the lowest PROCEED score teaches a single degree programme in a profession that is not coded as ‘graduate’ and they hold a TEF Gold award. Gavin Williamson’s comment that ‘the data proves there is much more work to be done’ is only part of the story. Data without interpretation is just numbers.

  2. I am not sure that the assertion that: “the changes mean the quality of information on offer to prospective students has improved” stacks up.

    Institutions who do well on PROCEED do so in large part because they teach students from middle-class backgrounds with excellent A-level results who have just left school and are from and return to work in London and the Home Counties. I am not sure how much help knowing that students with these characteristics do well in later life (who knew?) helps students who are from more disadvantaged backgrounds. It is like looking at absolute A-level results and progression to elite universities of ultra-selective grammar schools and concluding that these schools must provide a far better education than other schools to achieve such results.

    Contextualisation is crucial in allowing individual students to know the outcomes achieved at an institution/course by *students who are similar to them* and so make an informed choice. This is why, in the schools sector, the UK Government utilises value-added measures to at least control for prior attainment.

    1. @Pete, you’re quite right, of course it doesn’t improve the information for a student making a choice that’s best for them. But that’s not really the point.

The point of HE (according to government) is to qualify people for jobs rather than, say, providing them with a bundle of perspectives that help them immunise themselves against their own natural stupidity. Much of the sector delivers a lot less of this value — as ably demonstrated by the metric. HE is really cost-inefficient at pumping out jobbing graduates.

      Controlling for variables such as prior attainment, ethnicity, age, gender, etc., leads you to conclude that some institutions are doing quite well, relatively speaking. But these institutions are still absolutely worse than the prestigious institutions performing relatively poorly! Their method allows them to cut the tail off the snake, improving the overall cost-efficiency of HE at getting people into jobs.

      Naive though this is, I do wish they’d be honest with the sector (annoying though we can be) and give us a real diagnosis along with their treatment, rather than have their comms team come up with apparently plausible spin.

  3. Hello,

    Is there a convenient place where I can check which professions or job titles fall into the definition of “professional employment”? I quickly looked through the OfS website but I couldn’t find anything.

    Thanks!

    1. Jason – It is SOC major groups 1-3 (see https://www.ons.gov.uk/methodology/classificationsandstandards/standardoccupationalclassificationsoc/soc2010/soc2010volume2thestructureandcodingindex). We have also included Teaching Assistants and Veterinary Nurses (SOC codes 6125 and 6131). Full details of our methodology can be found in our publication (see https://www.officeforstudents.org.uk/media/b4bd5b29-0ddb-4e68-9ebf-811c111f150f/proceed-updated-methodology-and-results.pdf)
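      A rough sketch of how that definition could be applied in code, using the SOC 2010 codes named above – purely illustrative, not the OfS's own implementation:

      ```python
      # Illustrative only: an occupation counts as "professional employment"
      # if its SOC 2010 code sits in major groups 1-3, or is one of the two
      # named exceptions (Teaching Assistants and Veterinary Nurses).
      PROFESSIONAL_EXCEPTIONS = {6125, 6131}

      def is_professional(soc_code: int) -> bool:
          major_group = soc_code // 1000  # first digit of a four-digit SOC code
          return major_group in (1, 2, 3) or soc_code in PROFESSIONAL_EXCEPTIONS

      print(is_professional(2136))  # True  - a major group 2 (professional) code
      print(is_professional(6125))  # True  - teaching assistants, via the exception
      print(is_professional(7111))  # False - a major group 7 (sales) code
      ```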

  4. Also, really worth noting the stats compare two entirely different cohorts of students – a current graduating year against grads who left two years ago. I suspect many courses change dramatically over a couple of years, up and down in quality. Therefore it doesn’t really help people to make informed choices.
