What’s the point of LEO in 2023?

LEO – it promised much, but in regulatory terms has delivered little. David Kernohan wonders what went wrong

David Kernohan is Deputy Editor of Wonkhe

The Longitudinal Education Outcomes (LEO) dataset was officially launched in a blaze of publicity in Jo Johnson’s 2016 White Paper (Success as a Knowledge Economy), and the sector’s wonks were quick to spot policy implications.

The ability to mash together tax and benefits data with individual student records offered a new way to examine student outcomes many years after graduation, and allowed us to map these to subject of study, mode of study, and provider. Previously such insight was limited to the triennial longer-term DLHE survey – this was an in-depth look at the real, individual benefits of higher education.

Though we know LEO as a public data release, the real benefit has been felt by the lucky few approved to analyse the raw data. Most notably, a number of influential Institute for Fiscal Studies reports on the longer-term benefits of a degree (and the longer-term costs to the government) draw on a combination of LEO data and cohort studies.

What you could have won

However, I think many of us were expecting LEO to play a much bigger role in regulation. If we ignore (as most people do) the Discover Uni resources, we look in vain for LEO data in Office for Students publications or dashboards (you can, if you wish, use it contextually in TEF submissions and APPs, and it’s one of the 8 million things OfS could decide to look at when making a decision to investigate based on B3).

The initial intention was very different. Around 2012 David Willetts saw what would become LEO data as the key to a more nuanced approach to student loan repayment.

Imagine that in the future we discover that the RAB charge [loan non-repayment rate] for a Bristol graduate was 10 per cent. Maybe some other university … we are only going to get 60 per cent back. Going beyond that it becomes an interesting question, to what extent you can incentivise universities to lower their own RAB charges.

In his wilder moments, Willetts went beyond the language of incentivisation.

I expect that, in the future, as the data accrue [sic], the policy debate will be about the RAB charge for individual institutions

The idea here was clearly the calculation of individual subject and provider level RAB charges – essentially the proportion of the money lent that a graduate is not expected to repay – with a view to offering differential eligibility or terms for different students. Willetts – and those who keep the guttering flame of the immaculate purity of the post-2012 finance system alive – have shied away from such language in recent years, but if you’ve been following the “low quality courses” debate the ghost of the idea is still there.
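
To make the arithmetic concrete, here is a minimal sketch (in Python, with invented numbers) of the kind of provider-level calculation Willetts was gesturing at – the RAB charge as the share of loan outlay not expected to come back. The real calculation forecasts and discounts repayments over the full repayment term; the provider names and figures below are purely illustrative.

```python
# Toy illustration only: a provider-level RAB charge treated as the share of
# loan outlay not expected to be repaid. Real modelling forecasts and
# discounts repayments over the whole repayment term.

def rab_charge(loan_outlay: float, expected_repayments: float) -> float:
    """Proportion of the money lent that is not expected to come back."""
    return (loan_outlay - expected_repayments) / loan_outlay

# Hypothetical providers, echoing Willetts' "10 per cent" and "only going to
# get 60 per cent back" examples (all figures invented for illustration).
providers = {
    "Provider A": (45_000, 40_500),  # expect 90% back -> RAB charge of 10%
    "Provider B": (45_000, 27_000),  # expect 60% back -> RAB charge of 40%
}

for name, (outlay, repaid) in providers.items():
    print(f"{name}: RAB charge {rab_charge(outlay, repaid):.0%}")
```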

It’s good but it’s not right

So why didn’t it happen? LEO has always been controversial – but the pushback in mainstream debate back in 2016 was moral rather than technical. Many people made the case that a degree (or the student experience more generally) conferred more than just a lifetime salary uplift. For treasury purposes, of course, this uplift had more to do with the overall affordability of the student finance scheme – graduates who earn less pay back less.

Meanwhile, LEO itself was showing some of the flaws of this mindset. Female graduates earn less than their peers, and the salary gaps experienced by ethnic minorities and those from disadvantaged backgrounds are real. Even some of the subject area gaps – part of the ostensible “low quality” rhetoric for years – didn’t bear analysis once you took into account low public sector salaries and the portfolio careers that artists are forced into if they wish to continue their practice (LEO assumes one full-time job is the source of your salary, and doesn’t scale up part-time earnings).

The final nail was really the addition of work on regional salary differentials – a talented graduate from a disadvantaged background in the north east may choose to study and work locally, even though a better salary (if not a better quality of life) is available in London. Indeed, this choice was briefly (via the levelling up agenda) government policy.

If you penalise an institution or subject (perhaps by limiting loan eligibility) based on salary then, as well as hitting whatever you might define as “low quality”, you cause collateral damage to a lot of provision that you do want.

Out of the black and into the red

The end point of this thinking is the binary “progression” indicator used by the Office for Students – which skirts neatly around the problem of low-paid professional public sector roles by defining them as a positive outcome (a graduate job, in other words). As we’ve been over on Wonkhe before, the use of the Standard Occupational Classification (SOC) as the arbiter of graduateness is not an exact science – and for treasury purposes it also weakens the link between outcomes and repayments.

Rumours about a conclusion to the Higher Education Reform consultation, issued way back in February 2022, have started to resurface after months of silence. The mood music is that we’ve moved away from both student number controls (except as a reminder of OfS’ existing powers) and minimum eligibility requirements as a means of shaping the sector to focus on the government’s preferred provision. It looks, for the moment, as if the OfS’ “boots on the ground” work (investigations without a rubric, formal announcement, or clear intention to report) is the only way the government intends to combat the scourge of graduates not earning enough to pay back student loans, on a subject or provider basis.

The failure of this LEO-based plan (though, to be strictly accurate, LEO is still on the table for potential regulatory purposes) is what – arguably – has left us with the mess that is Plan 5 loans. And it may go some way towards explaining the curiously regressive nature of the scheme. By design, those who take courses that lead to low graduate earnings will pay more, while those who move swiftly into high-earning jobs get the subsidy. To most of us, unfair – to the government, an incentive for applicants to choose “better” courses.
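
To see where the “regressive” charge comes from, here is a deliberately crude sketch comparing total real-terms repayments under Plan 2 and Plan 5. The published parameters (Plan 2: £27,295 threshold, 30-year write-off, above-inflation interest; Plan 5: £25,000 threshold, 40-year write-off, RPI-only interest) are real, but everything else – flat real salaries, a £45,000 loan, Plan 2’s variable interest simplified to a flat 3 per cent real rate, thresholds held fixed in real terms – is my own simplification, not official modelling.

```python
# Crude Plan 2 vs Plan 5 comparison in real terms. Assumptions (mine, not
# official modelling): flat real salary, thresholds fixed in real terms,
# Plan 2's variable above-RPI interest simplified to 3% real, Plan 5 at 0%
# real (RPI only), and a £45,000 loan on graduation.

def total_repaid(salary, loan, threshold, real_rate, write_off_years, rate=0.09):
    balance, paid = loan, 0.0
    for _ in range(write_off_years):
        balance *= 1 + real_rate                                   # real interest accrues
        payment = min(balance, rate * max(0.0, salary - threshold))
        balance -= payment
        paid += payment
        if balance <= 0:
            break
    return paid                                                    # remainder is written off

plans = {
    "Plan 2": dict(threshold=27_295, real_rate=0.03, write_off_years=30),
    "Plan 5": dict(threshold=25_000, real_rate=0.00, write_off_years=40),
}

for salary in (30_000, 80_000):
    for name, terms in plans.items():
        print(f"£{salary:,} earner, {name}: repays £{total_repaid(salary, 45_000, **terms):,.0f}")
```

On these (made-up) assumptions, the £30,000 earner repays around £7,300 under Plan 2 terms but £18,000 under Plan 5, while the £80,000 earner’s bill falls from roughly £54,000 to £45,000 – the pattern described above.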

Play your cards right

LEO isn’t going anywhere, at least in the short term. The heavy lifting of primary legislation has been done, and at this point what remains – a useful administrative database for research – can be seen as a gift to the world of think tanks and Tableau fans.

[Interactive visualisation: salary by subject in real terms – full screen]

First up, let’s take a look at some of the lesser-seen data by subject to spot other areas where LEO is subject to externalities – salary by subject in real terms. The pattern to look for here is largely stasis, with a small drop in the pandemic years as everything went weird. So if you are sticking by the contention that some subject areas are just no good at getting graduates a good job, you need to reckon with the idea that this has largely stayed static for more than a decade. Which, I guess, you could parlay into an argument that all universities are awful (though I hear the per-word rates at Spiked aren’t great) – but it is more likely to indicate that the UK graduate job market is rather like an oil tanker (very slow to change course, and rapidly becoming obsolete).

If you’ve read this far, what you are probably after is data by subject and provider – so here I’ve added a proxy for the preferred OfS measure (the proportion of graduates in employment, further study, or both) to the standard median salaries plus quartiles display. Here, we see that salary and what we might call “graduateness” are separate attributes with little relationship – and recall, of course, that the salary of someone in further study is likely to be low. Even given the low bar for recent DfE ministers, nobody has yet made the case that graduates who immediately have to go off and do another course may not be the best investment for the government – but the case is there to be made.

[Interactive visualisation: median salaries and progression proxy by subject and provider – full screen]

This defaults to the most recent year of data for 1 year after graduation – you can look at any year of interest but it is fiddly – select “all” on the top three filters and start from there.

Of course – despite the mildly shonky nature of LEO data – the quality and standards push from DfE isn’t going anywhere either. Ministers and officials still very much see the rooting out of “low quality courses” (however defined) as a key animating factor for higher education policy – the department (and most likely the treasury) stands fully behind the “boots on the ground” work of the Office for Students in this area.

Poor quality courses are (assuming they lead inexorably to poor lifetime earnings) literally more expensive for the government to support, because of the reduced rate of repayment. Willetts’ early speculation about a more nuanced RAB calculation based on LEO-style data as a way of directing investment has proven unworkable, but everything we’ve tried subsequently has also proven unworkable.

For universities themselves this should be a concern – in providing a costly self-investment opportunity, you should really be able to talk credibly about returns – but for DfE and the treasury it is a brake on ambitions to “rightsize” the sector’s undergraduate offer. For my part, I’d keep an eye on the work of the DfE Unit for Future Skills – though, as every careers professional knows, future skills demands are notoriously difficult to predict.

3 responses to “What’s the point of LEO in 2023?”

  1. Great article with some very interesting context around the use of LEO, and I totally agree with the gist. LEO gets a hard time from providers but there’s nothing wrong with it per se – problems emerge when trying to use it to measure performance at provider level. Thankfully we seem to have arrived at a fairly comfortable resting point for LEO in this context: not really going anywhere, but not really doing any damage either (which is not necessarily how things were shaping up a few years ago). Potential (mis)use in shaping high-level policy, where it should shine, feels a bit more of a threat.

  2. I think a big benefit of the (cost-neutral) Plan 5 reforms – aside from the political attractiveness of playing on misunderstanding of the student loan system to make abolishing interest and reducing student debt sound like a progressive reform – is that, while DfE forecasts suggest it will lead to a small reduction in mean lifetime repayments (from £24,700 under Plan 2 to £24,300 under Plan 5) and leave the RAB charge unchanged, it will radically reduce the transfer cost of the loans (down from 52% under Plan 2 to 30% under Plan 5), which is how the ONS accounts for their impact on the public finances.

    This is because there is a lot less interest accruing on the loans of those who do not repay the principal in full, and so less to be written off up front. However, this will ultimately be balanced in the public accounts in 20 years or so by lower income from high earners, who repay the principal and no longer have to pay interest on top.

    Impressive ingenuity to find another way of gaming the student loan accounting system tbh

    See https://explore-education-statistics.service.gov.uk/find-statistics/student-loan-forecasts-for-england
