Graduate Outcomes PIs: past, present, and future

There are currently no UK-wide performance indicators for graduate outcomes. Ben Cooper details how we got here, and what might happen next

Ben Cooper is a Business Analyst at Manchester Metropolitan University and Data Insights Director at the Association of Graduate Careers Advisory Services (AGCAS).

The outcomes-focused phase of OfS’s quality and standards review is nearly upon us – with the development of performance indicators promised as part of the looming consultation.

As the regulator works out how best to assess graduate outcomes, it’s helpful to reflect on historical interpretations of success, why we don’t know what success looks like now, and what the future vision is.

First destinations, first steps

While the intense focus on graduate outcomes within the sector is a relatively recent phenomenon, collecting and assessing information on university leavers’ destinations is not. Institutions were doing this for their own purposes long before a new(ish) sector statistics agency introduced a formal collection exercise in the mid-1990s and HESA’s First Destinations survey was born.

In 1999 HESA crunched First Destinations data to create the first official UK performance indicator (PI) for graduate outcomes, focusing on the proportion of leavers who entered employment shortly after university. Progression to further study was added in 2011, ten years after First Destinations evolved into DLHE which enjoyed a 17-year reign as the sector-standard measurement of graduate activity before being replaced by Graduate Outcomes in 2018.

The revised UKPI survived throughout, despite the obvious flaw that, in failing to distinguish between the types of work graduates went on to, it didn’t really provide a meaningful measure of performance. Leavers in low-skilled or low-valued jobs counted equally with those in “graduate level” roles.

Part of the reason HESA’s metric never evolved is because it didn’t have to. Despite offering the analytical advantage of sector benchmarking (and providing eager marketeers with impressive stats about student success), more useful ways to evaluate performance were found elsewhere. By the time providers started to pay really close attention to graduate outcomes as drivers of reputation and recruitment, league table rankings which first appeared in the 1990s had long established summary metrics from DLHE data. These calculations involved various ways of recording progression to highly skilled employment or further study and were widely adopted by institutions as internal PIs. Later they formed the basis of analogous approaches to provider assessment when a new regulatory era of higher education ushered in the OfS and TEF.

Present tense

So much for the history of PIs. Before looking to the future, it’s worth pausing on a rather astonishing present. While “good” graduate destinations have become perhaps the defining marker of provider quality, 18 months after Graduate Outcomes results were first released and three years since data collection began, there is still no official stance on how to measure this.

HESA were widely expected to lead development of a new UKPI to match the new survey, having announced plans to do so in early 2020. Eyebrows were raised back then about the perceived late timing of this work, but more widespread surprise followed when it was abandoned in 2021 as part of HESA’s move to distance themselves from performance measurement. HESA’s position is understandable (although unfortunate, because its expertise and influence are significant), but earlier disclosure would have been welcome to encourage more timely progress elsewhere.

OfS is likewise culpable for delaying the process, and was promising imminent work on PI development in consultation with the sector when survey results were first published. But without an immediate regulatory need for one (TEF would have looked absurd using DLHE data when Graduate Outcomes data was available, but the old exercise never really progressed past 2018), urgency drifted before things were swept up in the wider overhaul of quality and standards.

History repeats

Without much official direction or innovation from within the sector, and with new rankings to produce, league table compilers again led the development of performance measures for the first Graduate Outcomes results, typically by reworking the old DLHE metrics on highly skilled employment and/or further study. HEIs, needing a recognised means of assessing and comparing performance, adopted these measures internally before OfS – without making much headway in developing a PI in consultation with the sector – wrangled a version into the (experimental) Start to Success and Proceed provider assessment measures. Sound familiar?

The introduction of Graduate Outcomes should have prompted a fresh look at graduate destinations and the construction of a more nuanced interpretation of what “good” outcomes look like: one which considered new perspectives advocated by a range of conversant sector stakeholders, including providers and representative organisations like AGCAS.

This could have included undertaking a more detailed and contextual examination of activity, assessment of contribution to key skills or civic needs and, crucially, consideration of graduates’ own views of success. But however we define “good”, I’d like to move away from the idea that it can be usefully reduced into a single binary measure based on crude classifications of current employment and study.

This approach might have served well in the past but we’re in a different place now, with different data and where judgements about outcomes performance have different and more profound implications. It’s not just providers who matter here either. It doesn’t seem desirable for choice among a diverse student body to be influenced by such a condensed construal of success. Nor does it appear useful to construct a limited, UK-wide PI for four regulators who seem to be moving in rather different directions. Graduate Outcomes data (and the environment it sits within) is complex. Stakeholders need more advanced performance indicators to help make good sense of it.

Future hope

Arguably PI development should have started years ago as part of the planning for Graduate Outcomes. But does that matter right now when OfS is about to take this work forward?

Possibly. I worry that an endeavour which deserved dedicated attention will be lost within the wider piece and that with historical interpretations of performance recently embedded in the sector, a single stale definition of success will be mooted as the starting point for OfS proposals. If it is, let’s hope there’s time and scope for genuine consultation and consideration of alternatives. We’ve long been without a good approach to measuring outcomes and can hold out a little longer to get one.

5 responses to “Graduate Outcomes PIs: past, present, and future”

  1. Excellent article Ben, effectively summarising the hopes and fears of the sector, and the need to finally move away from imposed objective measures of ‘success’. Agree that institutional and individual contextualisation is critical to truly assess the impact that a university education has on our graduates’ futures.

  2. Excellent summary – the need for consultation is vital. This is a costly exercise and the survey needs to deliver meaningful data, with transparency of purpose.

  3. A thoughtful article – thank you Ben. It may be helpful to note that HESA’s decision on discontinuing the UKPIs ‘branding’ does not mean that the Agency is not looking in detail at how to construct new and better measures of Graduate success – this is a matter of wide interest as your article demonstrates. In addition to our Statistical Bulletin and Open Data, over the last couple of years we have published research on career satisfaction by ethnicity and salary returns by degree classification. Work like this is foundational in building scientific understanding of the factors relevant to achieving successful graduate outcomes. We have a plurality of dependable measures. If the ‘one true measure’ of graduate outcomes is possible, it is critical inquiry, experiment and empiricism that will eventually reveal the contours of its design.

    I would particularly like to draw readers’ attention to the paper we published on measuring the design and nature of work, utilizing the Graduate Voice questions. This paper takes inspiration from the outcomes from the Measuring Job Quality Working Group. Feedback so far has indicated this approach is worthy of further development. We would welcome further engagement with the ideas in the paper by all those engaged in trying to discriminate between good and bad outcomes, and securing the best possible outcomes for policy, practice, and most of all for students. You can find it here: https://www.hesa.ac.uk/data-and-analysis/research/statistical-measure-design-nature-work

    1. Thanks Dan – it was great to talk about HESA’s exploratory work on the data with you recently and I really look forward to hearing more about this in due course.

  4. Great summary Ben. The flaws with the surveys have been much discussed. But they do also give us very useful data and insights. Consultation is key to improving measures and response rates for all graduates.


Copyright © 2021 Wonkhe Ltd.
