Why everyone has their own graduate outcomes metric

As Graduate Outcomes data for 2020-21 is released, Ben Cooper wonders why it is so hard to get from the open data to the regulatory indicators

Ben Cooper is a Business Analyst at Manchester Metropolitan University and Data Insights Director at the Association of Graduate Careers Advisory Services (AGCAS).

The open release of Graduate Outcomes data will have been eagerly anticipated by colleagues across the sector who are keen to understand the latest employment trends for 2020-21 graduates and, crucially, understand how their results stack up against those of other institutions.

Given that this is the fourth annual Graduate Outcomes release, getting to grips with this sector data should be a relatively simple matter – but it feels more complicated than it needs to be.

In previous years, navigating HESA’s open data tables has presented difficulties for even experienced users. And while there’s hope that Jisc will have helped HESA tackle some of the technical gripes (including a strange inability to choose and retain a consistent set of demographic filters), more fundamental frustrations around the way the data is cut are likely to remain.


Here I’m referring to the different approaches to breaking down study profiles taken by HESA and OfS. The latter’s quality framework views provision in terms of sixteen primary modes and levels of study, and providers who have tuned into this way of thinking to help monitor quality compliance and evidence excellence could reasonably expect to see these splits duplicated across HESA’s outputs. Instead, there are several mismatches: HNC/HND qualifications appear in HESA’s tables but are not considered separately by OfS, while apprenticeships and integrated masters courses, which OfS does consider, get no mention at all.

The disparity actually runs a little deeper, because it’s not just the cohorts that are different but, in some cases, the way they are defined too, with OfS and HESA having slightly different ideas on how to derive level of study from student records. Notably, there’s also a glaring lack of data on the OfS progression measure in the tables, so those who are understandably eager to explore their relative position according to the sector’s “official” Graduate Outcomes performance indicator will have to wait for a separate update to OfS’ data sets to do so.


These observations may not constitute errors in everyone’s eyes. HESA and OfS may legitimately point out that HESA’s data reporting structures existed long before OfS’ quality framework, highlight HESA’s intentional step away from involvement in performance indicators, and note the need to serve stakeholders from across the UK, not just OfS-regulated English providers.

Nevertheless, it does feel more than a little odd that there’s no functionality to align the key outputs from the sector’s dominant regulator and its designated data body. Future iterations of HESA’s Discover Outcomes data dashboard, available through its Heidi Plus data package, could potentially address this, but it feels like a rather distant and exclusive proposition at the moment. Even providers who typically turn to HESA’s bespoke data sets to develop advanced analyses of Graduate Outcomes are unlikely to find easy access to a progression marker there.


This isn’t the only instance where approaches to the presentation of Graduate Outcomes data from the sector’s key players appear a little disjointed. Last week’s direction from the DfE and OfS instructing providers to point prospective students to Discover Uni as part of course advertising was a welcome change from previous guidance, which advocated the use of OfS’ long-obsolete “Proceed” data set. But over at the Discover Uni website you’ll see headline outcomes data designed to guide good student choices, consisting of earnings data and a measure of work and/or study based on HESA’s data. Dig down and there are further breakdowns of activity, and of employment by skill level as captured by the survey, but no sign of OfS’ calculated metrics.

There are also examples of arguably less than helpful thinking from OfS within its own sphere of operations. Having parked the experimental Proceed measure when establishing a new quality and standards framework, it could be judged excessive to resurrect a similar version into a crowded metrics space last month, yet “CEED” now sits alongside OfS’ internal KPIs, which already contain progression and completion strands. Turning to the regulator’s recent work on access and participation, Graduate Outcomes data takes a central role, with progression judged to evidence risks to student support and mental health as well as comprising a risk itself. It’s strange, though, that the regulator’s concern seems to be as much about further study as employment, when interim study is specifically excluded from positive outcomes in the progression metric.

Why can’t we be friends?

My key point here is not to critique OfS’ progression metric but to highlight some complexities around the presentation and interpretation of key Graduate Outcomes-derived data outputs from HESA, DfE and elements of OfS. Furthermore, I’d suggest that the sector could benefit from increased dialogue and collaborative thinking between these bodies to help address them.

This feels like a necessity for a well-established data set at the forefront of the sector’s policy and regulatory landscape. While accepting an inevitable need to use survey data for a variety of purposes, and welcoming nuanced interpretations of results, aspects of current applications and approaches are creating inconsistency and potential confusion.

For colleagues familiar with these intricacies, this means the burden of ironing out and explaining differences in the data for others in their institutions. For less familiar users trying to make sense of the wealth of information around graduate destinations, these issues can be even more challenging to navigate.

3 responses to “Why everyone has their own graduate outcomes metric”

  1. Great summary, Ben. I think as soon as HESA releases some graduate outcomes data there is a clamour from within HE to establish performance and plan. However, because of the fragmented approach these analytics are drip-fed and make planning to meet OfS requirements all the more disjointed.

  2. Thanks Ben, totally agree with your sentiment and frustration. HESA are caught between a rock and a hard place in trying to comply with devolved regulatory authorities, and have been pushed into a corner where they want to open data out, but not to the extent that providers across the UK can understand measurement and compliance. It sort of helps no-one. The only solution is to have different designated data bodies across nations who support those regulated providers specifically, with open data that actually lines up with their regulatory responsibilities. We too are frustrated at caveating HESA data to explain the mismatch with eventual OfS versions…

  3. This should be a positive day for UK unis – unemployment down, employment up, etc. – but the nature of the data makes it really hard to get a message out. UK graduates are 90%+ (often 95%) into employment or further study, and 90%+ progress into second year – some strong metrics. Meanwhile government are pushing young people into apprenticeships where drop-out rates are 30%+, according to FE Week and others. UK HE is highly efficient and successful!
