The search for Britain’s worst higher education courses

There are some courses, at some institutions, that students would probably be well advised not to take.

Not my words – but the words of Edward Peck, backed entirely by his panel Chair Philip Augar, speaking at a Commons Education Committee accountability hearing right at the end of the May administration.

Wait… what?

In isolation, it’s a controversial point, but one that betrays a line of thinking that underpins both the Augar report and much recent higher education policy speculation.

Let’s unpack the idea a bit. Augar (and others) see the proportion of a student loan (or student contribution, if you’d rather) that is not repaid as a government subsidy for a particular course. If young Gavin opts to study Social Sciences at the University of Yorkshire and doesn’t earn more than the repayment threshold (currently £26,000 – though the Augar report proposal is to lower that to the median earnings of his age group) for much of his career then the taxpayer picks up the tab.
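To see how the “subsidy” arises, here’s a minimal sketch. It assumes Plan 2-style terms – repayments of 9 per cent of income above the threshold, with the balance written off after 30 years – and ignores interest entirely for simplicity, so treat the figures as illustrative only:

```python
# Sketch of how an unrepaid loan balance becomes a government subsidy.
# Assumed Plan 2-style terms: 9% of income above the threshold is
# repaid each year, and anything left after 30 years is written off.
# Interest is ignored for simplicity.

def unpaid_after_writeoff(balance, salary, threshold=26_000,
                          rate=0.09, years=30):
    """Return the balance left to write off after `years` of
    repayments at a flat salary (no interest)."""
    for _ in range(years):
        repayment = max(0.0, salary - threshold) * rate
        balance = max(0.0, balance - repayment)
        if balance == 0.0:
            break
    return balance

# Gavin on £28,000 repays just £180 a year, so almost all of a
# £45,000 balance is eventually picked up by the taxpayer.
print(round(unpaid_after_writeoff(45_000, 28_000)))

# A higher earner on £60,000 clears the same balance well inside
# the 30-year window, so the subsidy is zero.
print(round(unpaid_after_writeoff(45_000, 60_000)))
```

The real system compounds interest and tracks a varying salary, but the shape of the argument is the same: the lower the lifetime earnings, the larger the slice the taxpayer picks up.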

This, to the Augar panel, is a mistake. With limited funds available, surely these should be used to support courses that turn out graduates with the skills employers and the nation need? Couldn’t Gavin’s subsidy instead support more students in engineering, or nursing, or artificial intelligence – linked to the skills needs identified in the government’s own Industrial Strategy?

Heading for the data

Our sudden national interest in graduate salaries has been sparked by the release of LEO data, which shows median salaries by course, cohort, and provider at set points after graduation. The dataset also shows the differences in salary based on background, pre-university attainment, sex, ethnicity, and – almost – region.

Augar draws primarily on a report produced by the IFS, which uses LEO-like data to examine the salaries of a 10% sample of the 1999 cohort at age 31 – with links to tax records from 2001-2 to 2012-13 to get a sense of the differences between salary for different life choices. These differences then get extrapolated into the future to get a sense of total lifetime earnings for this sample.

However, one does not simply extrapolate future earnings based on past performance! The methodology is set out in summary in the IFS paper (there’s also a more detailed academic paper available). I’m as up for a bit of actuarial science as the next wonk – but even my non-specialist eyes can see that there are some big assumptions baked in to these projections. The team used UK Labour Force Survey data to make estimates of salary ranking of an individual within their cohort, which is then used with HMRC data to make an estimate of income.
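As a very rough illustration of the rank-based approach, here’s a toy sketch. The earnings distributions below are invented, and the real methodology is far more sophisticated – but the core move, carrying a graduate’s percentile rank in their cohort’s earnings distribution over to an older cohort’s distribution, is the same:

```python
# Toy version of a rank-based earnings projection: estimate where a
# graduate sits in their cohort's earnings distribution at 31, then
# read their projected income off an older cohort's distribution at
# the same percentile. The distributions here are invented.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in earnings distributions (log-normal, in pounds) at two ages.
earnings_at_31 = rng.lognormal(mean=10.3, sigma=0.4, size=10_000)
earnings_at_45 = rng.lognormal(mean=10.6, sigma=0.5, size=10_000)

def project(salary_at_31):
    """Project a salary forward by holding its percentile rank fixed
    between the age-31 and age-45 distributions."""
    rank = (earnings_at_31 < salary_at_31).mean()   # percentile at 31
    return float(np.quantile(earnings_at_45, rank)) # same rank at 45

print(round(project(30_000)))
```

The caveat – and it’s the article’s whole point – is baked into that one line: the projection assumes ranks stay stable over a career, and that the older cohort’s experience is a fair guide to the younger one’s.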

It’s dashed clever stuff – but even the IFS notes there are numerous caveats. We are, lest we forget, estimating future earnings of current graduates based on the earnings of those who graduated in 1999 and the subsequent earnings performance of people who would have been 18 as far back as the mid-60s. Everything from likely career trajectories to the age of retirement will have changed substantially in this time.

Class of 1999, put your hands in the air!

May I confess to a conflict of interest here? Depending on how you make the cut, I’m in the class of 1999. I may even be in this sample. However, I suspect that my subject of study (English literature and performing arts) and alma mater (De Montfort) – excellent as the experience was – have had less impact on my subsequent career (pause for my mother to put her head in her hands) in annoying people with data and words than other components of what I laughingly call my personality.

My skills (such as they are) may also not have an exact bearing on my salary – I’ve spent most of my career in or around public service, and my later jump to being a full time higher education data nerd and writer of high quality briefing emails hasn’t exactly landed me the yachts and Aston Martin DB11s that the pampered elite HE journalists at other publications may enjoy. But I’m largely happy and fulfilled – and why has this article suddenly become my Annual Review?

I tell this story to make a few points. Salary, skills, and qualifications may – for most people – be largely independent of one another. Other life choices (even to the extent of being in the right place at the right time with the right unexpected interests) have an impact too. Tying the course of a career entirely to one choice made at age 17 is perverse.

(Not) those terrible courses in full

But all this may not be enough to sway those tempted by Jack Britton’s smooth data mangling. To them, I pose another question – how can we identify the characteristics of these dreadful courses? How can we advise students against them if we don’t know what they are?

Sure – we have LEO. But if LEO had any valid link to course choice, as opposed to other contextual variables, there would be aspects of the course other than subject that would predict a low LEO score. We can all agree on that, though we may differ on what those aspects might be.

Here I’ve chosen low tariff entry (less than 80 UCAS Tariff points – below a C and two Ds at A level) as a proxy. Not because I necessarily believe it’s the sign of a bad course (spoilers: it isn’t), but because it ties into another bad HE policy meme that Augar’s panel hinted at embracing – the idea that low tariff entry to HE is inherently sinful and/or of no societal value.

[Interactive chart: LEO median salary plotted against the proportion of low tariff entrants, by course]

I’ve plotted data for every single course (where LEO and tariff data exists) available as of July 2019 via Unistats. There is no correlation overall, and there is no correlation at any of the CAH level 1 or CAH level 3 subjects I looked at. Do have a play and see if you can spot any I’ve missed.

There’s no link between LEO and the availability of a foundation year, or the mode of delivery, either – in case you were wondering.

Data notes: Unistats is a funny old dataset – expect more on it from me over the summer. It uses LEO and tariff data at various levels of aggregation, to compensate where student numbers for a particular course are low. These aggregation levels are shown in the data by a handy numeric code, and where they exist I’ve included details. Data is used at either course level, or at CAH level 1 (“Engineering”), level 2 (“Mechanical Engineering”) or level 3 (“Aerospace Engineering”), over one or two years of data. This does rather undermine the case for talking about courses per se with this data, but as we are somehow happy to give this data (as “information”) to prospective students I thought I’d go with it.

Unistats uses the median salary (for men and women) after three years, and the tariff data (expressed as heavily rounded percentages) is from either the previous year or previous two years, depending on the amount of data that is available.
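For what it’s worth, the correlation check described above is simple to reproduce. Here’s a minimal sketch – the column names are hypothetical stand-ins for fields in a Unistats extract, and the data below is invented toy data, not the real thing:

```python
# Minimal sketch of the tariff-vs-LEO correlation check. Column names
# (`tariff_under_80_pct`, `leo_median_salary`) are hypothetical
# stand-ins for a per-course Unistats extract; the data is invented.
import pandas as pd

def tariff_salary_correlation(df: pd.DataFrame) -> float:
    """Pearson correlation, across courses, between the share of
    low tariff entrants and the LEO median salary."""
    return df["tariff_under_80_pct"].corr(df["leo_median_salary"])

# Toy data standing in for the real course-level extract.
courses = pd.DataFrame({
    "tariff_under_80_pct": [10, 40, 70, 25, 55],
    "leo_median_salary": [24_000, 25_500, 23_500, 27_000, 24_500],
})
print(round(tariff_salary_correlation(courses), 2))
```

On the real data the interesting result is the absence of a result: run per CAH subject group as well as overall, and the coefficient stays stubbornly close to zero.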

There’s so many good ways to be bad

So it’s clearly harder to spot “bad” courses from data than you might think, which rather ruins my pivot into Channel 5 daytime TV formats, but it also backs up what is generally understood about entry to, and prospects from, higher education courses. I’ve got a whole raft of other course level data that doesn’t correlate with LEO either, which I was saving for series two.

But making an assumption that course A at institution B is not worthy of subsidy needs to be backed up by something. It’s not enough just to point at LEO when there are so many confounding variables, and where actual attributes of courses don’t seem to match up to salary. If you want to close bad courses – tell us which courses you mean.

8 responses to “The search for Britain’s worst higher education courses”

  1. ” being a full time higher education data nerd and writer of high quality briefing emails hasn’t exactly landed me the yachts and Aston Martin DB11s that the pampered elite HE journalists at other publications may enjoy” … who are these people? Most HE hacks are a very long way from being pampered!

  2. Here I was being deliberately ridiculous. I don’t think any HE journalists own a yacht and a luxury sports car.

    Now, vice chancellors on the other hand …

  3. As a 1999 graduate with a Finance degree from a 1992 Uni that was never used, I’m sure I was a great asset to their assumptions…

  4. It is a tough call to make on such assessments. Focusing narrowly on earned income, as they say, “prediction is hard, especially about the future”. Even if one believed that future income was the only criterion of choice, extrapolation from past earnings is likely to be flawed. However, that is true of all such decisions, like how to invest your savings. That is not an argument against trying. The real issues, it seems to me, are two: what is the motivation for studying, and who is paying?

    Some people may choose degree subjects on a hypothesis about future income, others may choose on other grounds. If that were the only issue, it would be a private matter. That is, if the individual student were paying personally rather than the society. If we collectively fund David Kernohan’s degree, we need to decide both what we want from the exercise and how much we are prepared to put into this (as opposed to adding money to the NHS, for example). The latter is the key point. There are evident collective benefits to having an educated population, but that doesn’t mean that everyone ought to be funded to get a doctorate. In essence, we ought to be prepared to put some sum per capita into degrees to generate that benefit. But some of the benefit will be privately captured and it isn’t unreasonable to ask for some form of repayment.

    More problematic to me is that the degree course is far from the only ingredient in future success. Social capital is what motivates students and is very important in them making the most of a university education (of any education). Discussions about fees omit this entirely. Perhaps a small start would be to separate the LEO or analogous outcomes according to indications about the background of the students. It might be quite revealing.

  5. ‘A picture is worth a thousand words’. I’m intrigued as to why the headline image was chosen.
