Research led by the Institute for Fiscal Studies (IFS), released by the Department for Education yesterday, has highlighted the importance of university courses in determining graduates’ earnings. But what are the implications for government policy?
Recent higher education policy has highlighted the government’s desire to strengthen incentives for universities to provide high-quality teaching that prepares students for the labour market.
Examples of this include the introduction of the Teaching Excellence and Student Outcomes Framework (TEF), which grades universities based on various measures associated with teaching quality, and the increased publication of statistics on the average earnings of graduates from different institutions.
However, simple comparisons of the average earnings of graduates can be very misleading. Universities are intentionally academically selective, meaning the characteristics of their students differ, and these differences can have a significant impact on graduates’ earnings. An institution like Cambridge may see very high graduate earnings simply because it admits high-ability students – who would have high earnings regardless – and the result may have nothing to do with the actual impact of the education provided.
Comparable outcomes
The IFS report is a significant contribution towards filling this gap in the evidence. By showing the earnings outcomes of graduates from different universities when comparing similar students, these figures strip out the student composition effect and highlight how the value that degrees directly add to graduates’ earnings varies by institution and subject.
The findings are stark. Different combinations of institution and subject have vastly different impacts on the earnings of their graduates and, despite common perceptions to the contrary, can matter more for earnings than student characteristics on entry to university. Medicine and economics degrees increase graduates’ early-career earnings by 25% more than English and history degrees. Russell Group universities increase earnings by around 10% more than the average degree, and the very top universities – LSE, Oxford and Imperial – by more than 50% more. Even within the same subject there is a wide range across institutions: the highest-return business courses deliver returns 50% higher than the average degree, while the lowest-return business courses have below-average returns.
This newly available data is a significant step forward. Policymakers can evaluate which courses are good at adding value to students. Meanwhile, students have more information when making their choices.
Visualisation created from IFS data by David Kernohan at Wonkhe.
But what does this mean for university accountability?
Universities intrinsically care about the outcomes of their graduates, so this may highlight areas where they are succeeding or doing less well. Or, if students use this kind of information when making their degree choices, as the government hopes, it might change university behaviour as institutions try to attract new students.
Alternatively, the government could choose to use such measures of quality as a direct policy lever. They could link the fees universities charge students to graduate earnings, much like the initial proposal to link fee levels to TEF performance. This would likely increase universities’ focus on the employment outcomes of their students.
In some ways, this would be very positive. Universities which do poorly can learn from those which do well. One theory for the good performance of the University of Bath, for example, is the prevalence of sandwich courses that give students work experience, which is crucial for helping them onto the job ladder. Increasing activities that are successful at improving graduate outcomes could benefit everybody.
But there are obvious drawbacks too. Focusing on employment outcomes might lead universities to neglect courses or modules which offer lower labour market returns but provide value to society or graduates in other ways. The government must keep in mind the consequences of focusing on a narrow range of outcomes and consider methods to evaluate universities in ways which take into account wider value to society. Furthermore, by their very nature, these measures of value added, based on employment outcomes five years after students graduate, are only available with a considerable time lag. This could weaken the mechanisms that incentivise universities to improve.
Improving university accountability is not straightforward and there are many potential unintended consequences to reforms. But we shouldn’t get carried away. The government has simply made this information available, without saying how or whether it intends to use this for policy. And better information available to students and policymakers is surely a good thing.
The IFS report contains this key sentence on p.10: ‘Furthermore, to be valid and useful, measures of the quality of different higher education institutions would need to account for the prior attainment and characteristics of young people on entry into higher education. Without doing so, any measure of institutional quality would be biased and potentially misleading.’ In other words, LEO data, if it is to be used in the TEF etc., will always need to be benchmarked.
Not to mention, of course, the need to adjust for regional labour market conditions. It then all gets very complicated to assess universities using LEO data in a valid way.
Echoing the commenter above regarding the need to take regional labour markets into consideration, the public sector pay freeze also hits certain universities more, while slashes to public Arts funding make life even harder for those graduates who don’t have an independent income. The gender pay gap and sexism account for other outcomes. The message is ‘make sure you are born male, into a privileged family and live in London where wages are higher’. Perhaps addressing structural inequality might be a novel idea now that any residual fantasy that we operate a meritocracy has died.
The report is not a problem in itself, and it does provide astute applicants seeking economic gain with useful information. However, the press headlines about wasting money on bums-on-seats degrees and low-ROI universities suggest to me that the government’s agenda is to reduce loan risk by discouraging applicants from opting for courses which don’t produce the prerequisite level of salary for loan repayments to start. The potential loss of humanities programmes and widening-participation universities are both irrelevant to the neoliberal enterprise.
It is great to have more of this data in the public domain for analyses of this sort. However, it is imperative that policymakers are aware of its limits. In particular, I have seen no mention in any coverage of this report that it looks at PAYE employee earnings only. In the creative industries around 30% of workers are self-employed, rising to nearly 60% in music and performing arts and 40% in design. These individuals are not included, nor are any self-employed entrepreneurs. There is a danger, therefore, that policymakers, who rely on metrics in the absence of expertise, make incorrect decisions based on incomplete data. The points on public sector pay are also well made.
The UK is extraordinarily transparent in this area. Although the data have limits, it is easy to take for granted the fact that it is possible to make a comparison across institutions in this way. If there is a problem with accountability in the UK higher education sector and an issue with students making informed choices, most of the rest of the world can be written off as a lost cause. Much of the rest of the UK economy ditto (think for a minute about transparency and accountability in areas such as broadband speeds, car emissions &c &c).
In terms of causal inference, the whole exercise is of course a joke. Years spent at university don’t in themselves boost earnings. If they did, doing two, three or four degrees would put a graduate’s starting earnings at Vice Chancellor level. Signalling provides a one-off boost, and only in a relatively small number of areas does degree choice come into it as an independent factor.
The authors don’t seem to understand that correlation does not necessarily result from a direct causal relationship between two factors. Universities do not “increase earnings” as suggested here. The fact is, there is a correlation between future earnings and institution attended which is explained by a third factor, class composition of the student body.
@Nadia Renee the report does account for the type of students (prior attainment, socio-economic background, region and ethnicity) – that’s one of the major differences between this study and previous ones. The earnings estimates in the figure are compared to the expected salary based on those factors, not a raw average.
Of course you can argue that this doesn’t give a complete picture – the whole point behind UK universities’ complicated admissions processes with personal statements, interviews etc. (versus somewhere like China, where students are admitted entirely on Gaokao scores) is that there’s more to an applicant’s ability than grades alone. But it’s not fair to say that the authors haven’t considered class composition.