The release of the latest batch of Longitudinal Educational Outcomes (LEO) data has prompted a robust response – configured as a parliamentary briefing, no less – from Universities UK.
Wonkhe fans will know that we generally take the position that LEO is a fascinating research dataset if you’re into studying changes in the labour market over time. But for some of the wilder policy purposes put forward – supporting prospective students in making choices of subject or institution, directing funding council finance or other investment, making judgements about “good” and “bad” universities and courses – there are so many caveats to attach that the temptation is not to bother.
You’ll never believe number six
Universities UK lists ten reasons why LEO should not be used, as below:
- LEO doesn’t account for the difference between full-time and part-time work.
- LEO doesn’t account for the region a graduate is working in.
- External economic trends have an impact on LEO.
- LEO excludes a lot of information about self-employed graduates.
- Graduates who left the UK (or have not paid tax in the UK) after completing their studies are not included.
- LEO doesn’t account for social or cultural benefits to the graduate.
- Other factors (beyond institution and subject) have an influence on graduate income.
- LEO doesn’t handle multi-subject courses.
- LEO doesn’t account for societal benefits.
- Graduates may be satisfied despite lower earnings.
I don’t think any of these will come as a surprise to anyone who has followed the debates as we’ve got to know the data. But it is possible to condense these ten (and more) into just one caveat:
LEO is an indicator. It’s not an exact measure, and it isn’t a prediction.
Exactness is not always a virtue
By calling LEO an indicator rather than a measure, I mean that the aggregate data and the trends within it are more important than the absolute values. The fact that economics graduates tend to earn more than health sciences graduates is interesting; the precise difference, in pounds, between the two subjects is less so.
When we plot aspects of the data, there are other trends we should be taking note of:
- Men still earn more than women (in every subject, after every time period, and accounting for all other variables)
- High prior attainment is correlated with higher earnings overall, though for some subjects (education, health and social care, nursing, medicine, architecture) the correlation does not appear.
- POLAR quintile and free school meals eligibility show a mild correlation, with more disadvantaged groups showing lower earnings. This is still present, though less noticeable, after ten years.
- Part-time and sandwich courses correlate with an earnings premium from five years after graduation.
- Students who live at home during their studies have tended to earn less.
- A home domicile in London and the South East correlates with higher earnings – a home domicile in the North correlates with lower earnings.
- Asian students from an Indian or Chinese ethnic background have seen higher earnings, though white students catch up after 10 years.
- Combined studies students – likely a heterogeneous group – learned less after 10 years than those studying other subjects.
For each of these there could be many confounding variables – but this points to the need for further, interesting (and likely qualitative) research into what happened in the past, which is my other point. The more alert among you will have spotted that all of these findings relate to the past rather than predicting the future, and all of them point to underlying inequalities in the labour market over that period, rather than to changes in the value of particular degrees.
In an evidence-led policymaking world, we could use LEO data to interrogate inequalities in the workplace. Not just the continually shameful gender pay gap, or the impact of social background on the careers of graduates. I mean the wider cultural issues that mean we don’t value nurses, artists, and teachers as much as we value doctors, economists, and mathematicians.
One response to “Salary data doesn’t pay the bills”
Was this a Freudian slip? 🙂
“Combined studies students… learned less”