The introduction of the Small Business, Enterprise and Employment Act, as one of the final pieces of legislation of the last Parliament, paved the way for educational data to be linked to data from HMRC to examine the outcomes of leavers from education.
The Green Paper then stated that employment and earnings information derived from this linked data would be used as part of the common metrics that would feed into TEF for the purpose of informing applications for higher student tuition fees.
All of this has led to speculation that the venerable survey of graduate first destinations, which has been conducted by the sector for over 50 years and which is currently incarnated as DLHE, might be dropped. HESA’s announcement that outcomes data are under review has therefore excited a certain amount of comment. Why review something we no longer need? What is happening?
What the Small Business, Enterprise and Employment Act allows is the linkage of educational data to tax records, which means that, potentially, people allowed to access tax records can track someone’s educational journey through school, into HE, and then into the labour market and get accurate earnings information through their subsequent career. As the provisions state, this will be used to produce data on the returns to study of individual subjects and at individual institutions.
But there are good reasons that this isn’t the DLHE-killer some people have painted it as. One reason is data coverage. HMRC data will give us excellent information about salaries, and when linked to elements of student record data, this will potentially allow a vivid picture to be built of earnings by all kinds of characteristics – not merely institution and subject, but gender, ethnicity and disability status, amongst others. Previous educational experiences can be brought into the mix. But it’s not clear how good the information on other aspects of employment outcomes will actually be. Industry and occupation data are not guaranteed to be of the quality of the data collected through DLHE, and employment location may be limited to head-office addresses. Many very good objections have been raised about the use of salary data as a metric for university performance, and salary data alone simply isn’t enough for the many stakeholders in university outcomes data. We need more and better information.
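To make the linkage idea concrete, here is a minimal sketch of the kind of analysis the Act enables, using pandas. Every field name and record here is invented for illustration; the real HESA and HMRC schemas are far richer, and the actual matching process would be done on de-identified keys under strict controls.

```python
import pandas as pd

# Hypothetical HESA-style student records (fields and values invented).
students = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "institution": ["A", "A", "B", "B"],
    "subject": ["Physics", "History", "Physics", "History"],
    "gender": ["F", "M", "M", "F"],
})

# Hypothetical HMRC-style earnings records, keyed on the same identifier.
# Student 3 has no match (e.g. they moved overseas after graduating).
earnings = pd.DataFrame({
    "student_id": [1, 2, 4],
    "annual_salary": [28000, 21000, 24000],
})

# A left join shows both the analytical power and the coverage gap:
# unmatched graduates appear with missing salary data.
linked = students.merge(earnings, on="student_id", how="left")

# Earnings can then be broken down by any recorded characteristic,
# not merely institution and subject.
by_subject_gender = (
    linked.groupby(["subject", "gender"])["annual_salary"].median()
)
print(by_subject_gender)
```

The point of the sketch is the shape of the data, not the numbers: salary by characteristic falls out easily, but the unmatched row shows exactly the coverage problem discussed above, and nothing in the linked table says anything about occupation, hours or location.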
But there are more subtle reasons why it is advisable for the sector to continue to collect outcomes data. The first is the question of ownership. HMRC data is, obviously, not under the ownership of the HE sector. Indeed, such are the restrictions on how the data might be used that it is possible – even likely – that nobody in higher education will ever be able to access the full raw data. We may be able to obtain extracts, partial datasets or processed data – the details are still to be worked out – but anyone dreaming of using the full data will probably have to moderate their ambition. On one level, this is not an issue – indeed, it may be of value for HE to be assessed using data that nobody can accuse institutions of manipulating in any way. But data that is not administered or controlled by the sector cannot be modified or adapted to suit changing needs. Our own outcomes survey allows us to ask questions at a national level, including questions that may not be directly related to earnings, whose answers can then be used by the sector as a whole or by individual institutions.
The second issue is more subtle. DLHE data is collected with the informed consent of the participants. The same is not true of linked HMRC data, collected, as it is, by Act of Parliament. We can’t very well ask students to actively consent to their tax data being used by institutions – I am no expert, but I can’t help feeling it would be a pretty tough sell – and so data that comes from consenting participants may still be valuable.
DLHE has certainly come in for a good deal of scrutiny of late. It was reviewed relatively recently, in 2010, but the subsequent rapid expansion of public metrics has led to the data no longer meeting the needs of a much-expanded group of significant stakeholders. The 2010 Review largely drew upon the needs and experiences of the careers and employability community who were (and in many cases still are) responsible for collection and were the largest consumers of the output of the survey. Careers professionals remain an important stakeholder group and one whose experience of collecting and using outcomes data should not be sidelined, but this information is also crucial to other sector stakeholders and any data must meet more diverse needs.
And this is our opportunity. A space remains for a sector-owned, population-level outcomes survey, collected with participants’ knowledge and consent, which can address current sector needs. The most troublesome aspect of DLHE – the collection of salary data – is no longer vital. Everything else, from the reference date (we will probably move away from six months) to the sample coverage, is up for grabs. Shall we keep a focus on employment outcomes? Should we take a more qualitative view? Can we use a survey of this nature to examine more fundamental questions about the individual’s perception of the value of higher education?
2016 will almost certainly see the last DLHE as we know it, but the long story of UK higher education outcomes data is not yet over.
Excellent article. Two thoughts:
• We should also consider the international angle: what graduate career outcome metrics might allow international comparisons.
• I’m never comfortable with the idea of salary being equated with career success. (If I learnt more, perhaps I might be.) It’s rare that people base career choices on salary alone and often, they do so in spite of it. Think of social workers, nurses, charity workers, even academics for that matter.
While the potential to use HMRC data will be a huge boon to analysis of the labour market and the effectiveness of education in meeting that market’s needs, we must remember that this is only one way of looking at the issue and not necessarily the most relevant to individuals’ needs or choices.
Very good point about the international angle Johnny. I didn’t cover that as I think we can all agree that DLHE’s international coverage is not that great, and because although it’s a consideration in the new review, we’re pragmatic enough to realise that collecting really good international data would be very demanding of resources.
Salary is not a good measure of career success. The way individuals regard salary in career decision-making is complex and not reducible to ‘higher = better’.
I agree that the HMRC data will have real value – although I think the long-term value is more likely to be the ability to examine educational trajectories from very early on. It is possible to examine data back to the 80s, and for much earlier levels of education, and I think it could have exciting potential applications in examining social mobility, for example. But for university careers and guidance I see it more as being a supplement to the existing knowledge base rather than the One True Data Source that trumps all the others in terms of utility, reach and meaning.
I agree with you Charlie, the HMRC dataset is no replacement for DLHE. There are four groups in the population:
1. those who pay tax/claim benefits, are graduates, and can be matched to the HESA student return;
2. those who pay tax/claim benefits, are graduates, and cannot be matched to the HESA student return (overseas graduates, for example);
3. those who pay tax/claim benefits, are not graduates, and cannot be matched to the HESA student return;
4. those who do not pay tax/claim benefits.
Since many graduates cannot be linked to the HESA student return (group 2), the HMRC data tells us absolutely nothing about those who get a degree versus those who do not. However, I can almost guarantee it will be badly used to try to prove this point.
For those who do pay tax and can be matched to the HESA student population, we know nothing about what they are doing. Is it full-time or part-time work (is that £16,000 actually a phenomenal salary for someone who works just 10 hours a week)? What is their job title (surely this matters to current students who want to know what their peers went on to do)? Where in the country are they (yes, HMRC really don’t know this; is your £18,000 a good salary for central London)?
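The four-group breakdown and the full-time/part-time caveat above can be sketched in a few lines of Python. This is a hypothetical illustration only; the flags and figures are invented, not drawn from any real HMRC or HESA dataset.

```python
def classify(pays_tax_or_claims, is_graduate, matched_to_hesa):
    """Assign a record to one of the four population groups described above."""
    if not pays_tax_or_claims:
        return 4  # no tax/benefit footprint at all
    if is_graduate and matched_to_hesa:
        return 1  # visible in both HMRC and HESA data
    if is_graduate:
        return 2  # graduate, but unmatchable (e.g. an overseas graduate)
    return 3      # non-graduate taxpayer

# Only group 1 supports graduate-versus-non-graduate comparisons with
# full linkage; group 2 is invisible to the matched dataset.
assert classify(True, True, True) == 1
assert classify(True, True, False) == 2
assert classify(True, False, False) == 3
assert classify(False, False, False) == 4

# Why hours matter: £16,000 for 10 hours a week is a very different
# outcome from £16,000 for a 37.5-hour week, but HMRC records only
# the annual figure.
def hourly_rate(annual_salary, hours_per_week, weeks=52):
    return annual_salary / (hours_per_week * weeks)

print(round(hourly_rate(16000, 10), 2))    # part-time: roughly £30.77/hour
print(round(hourly_rate(16000, 37.5), 2))  # full-time: roughly £8.21/hour
```

The same annual salary maps to a near fourfold difference in hourly rate, which is exactly the information the matched tax data cannot supply.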
We already know all of this: it comes from one of the largest annual censuses in the world, one that is the envy of almost every other educational system, and yet we (as a society) rubbish it continually.
I’m excited to be part of the review working group as I think there are some important changes that can be made to DLHE but there is absolutely no chance that it can reasonably be replaced by the HMRC dataset.
I disagree with one statement you made in an otherwise excellent article:
“but the subsequent rapid expansion of public metrics has led to the data no longer meeting the needs of a much-expanded group of significant stakeholders.”
Who are the new stakeholders and what are their demands? The only one I can think of is the Teaching Excellence Framework. The current system of measuring outcomes for graduates entering the labour market puts those institutions that are traditionally vocational and embed high-quality employment skills into their curriculum at the top of the table; however, these are not the ones you would traditionally expect to top a league table.
There have been a couple of bad news stories of DLHE being manipulated which were quickly removed from the internet after (presumably) threats of litigation. This is to be expected when there are over 100 institutions fighting for an ever-decreasing pot of money; it happens in every sector once money is brought into the equation, and it is dealt with through effective regulation, not by throwing out a hugely successful and important tool.
So no, DLHE has not had its day; it just needs better regulation.
The weaknesses of DLHE however are legion:
1) It’s a snapshot only a notional six months after graduation; many graduates have yet to get into their preferred area of employment this early in their post-graduation career. Longitudinal surveys at least several years later (and repeated) are necessary to get an accurate picture of the changing nature of graduate employment.
2) Self-employment and ‘portfolio’ jobs (multiple employers) are much more widespread than in the past, especially in some sectors such as the creative arts, and salary data for the self-employed only really makes sense if it’s compiled on a 12-month tax-year basis. DLHE’s deadline falls before even the start of the first full tax year after graduation for July graduates, let alone its end. One size fits all doesn’t work, especially at institutional level: the specialist creative arts institutions, for example, are heavily handicapped by the current DLHE.
3) Jobs are changing all the time, and the very basic info collected by DLHE is inadequate to classify in sufficient detail the type of work undertaken, particularly the division between ‘graduate’ and ‘non-graduate’ jobs. [I’m not saying HMRC can do this either – we need some kind of basic job evaluation questionnaire, collected repeatedly over time.]
4) International movement within the job market has hugely increased, and we shouldn’t forget that large numbers of students from outside the UK are already included in the DLHE population, for whom the absence of any meaningful comparison with post-graduation employment overseas makes the data very difficult to use.
5) The DLHE data is rarely analysed in context (value-added, pre-entry qualifications, subject, gender, ethnicity etc.) and is instead used by league table compilers as if institutions could simply be compared on a like-for-like basis. This is not just a problem with DLHE but with the lack of proper statistical methodology; government is also at fault in using these league tables to determine what constitutes a ‘good’ university.
Now I’m not saying that handing the whole of the problem to HMRC is the solution, because as the original article points out there are major problems with that as well. However defending the indefensible is not the answer. The HE sector needs to come up with a long term plan for collecting robust data on graduate employment that combines various methodologies – some from HMRC, some from institutions, some from graduates themselves on a longitudinal basis, always analysed in terms of context. We should certainly start a pilot of something different based on a sample of say 20,000 graduates to track them and report on the best long term solution. A modern twist to the idea of ‘Seven-Up’?
Mike
To take the issues in order.
1+2. The point is taken to an extent, but rather than explaining the issues with six months, we need to hear specific, robust arguments about why another reference date is obviously better. These need to take into account that the further you move from graduation, the harder the sample becomes to reach, the more the data costs to collect, and the poorer the coverage. As LDLHE shows, a survey after 3.5 years can be less useful than one after six months if there are not enough people in it. That said, I think we are moving towards a different reference date, which will come with its own issues. There is no ‘right’ time to do the survey; every option is, for one reason or another, imperfect. We hope that if we change the date it will be to one that is less imperfect, although the one thing I can predict with absolute certainty is strongly expressed opinions as to why six months is better than the new date.
I’ve recently seen data from Professor Emma Jones at Leicester that suggests that DLHE data isn’t a bad predictor of longer-term outcomes, at least for STEM graduates. The same may not be true for arts graduates and I wonder if their experience is starting to diverge sufficiently that a specific tool might be necessary to look at it effectively. I think we’d need to see strong, robust evidence of the special nature of arts career tracks to make that judgement at this point though.
3+5. The issues here are not, as such, with the destination survey. In 3, the issue is with the Standard Occupational Classification used to code occupational data in all employment surveys in the UK, and in 5 with the way the data is used by Government (the data is analysed in a great many ways, including those you mention, although clearly the output needs better publicity).
If we need to code occupational data – and we certainly do – and if Government wishes to use the data, these issues will always arise and will not go away if we develop a different tool. We can develop something that has an eye on these questions though. The SOC question, which I agree is absolutely an issue, is one rather close to my heart.
4. I like the idea of an international tracker of employment but with the best will in the world, I am not sure anyone has the resources to do this effectively.
With regards to your longitudinal pilot suggestion: we have had at least three longitudinal tracks of graduates conducted by Professor Peter Elias and Professor Kate Purcell at the IER at Warwick – one on the 1995 graduating cohort (Working Out/Moving On), one on the 1999 graduating cohort (Class of ’99), and the Futuretrack survey that tracked 2005/6 applicants to university through to their post-graduation careers. All had sample sizes larger than 20,000 (which is too small to be useful here). They are good surveys which produced meaningful outcomes, but they were very costly, and much though we’d like it not to be, cost is a consideration. As you note in your earlier points, the graduate labour market changes rapidly and profoundly, and as a consequence I am sceptical that tracking current graduates past a certain point has long-term value in terms of informing the experience of their successors. I am not sure that examining the current employment of graduates in their 40s will tell current students a great deal about what they can expect in 20 years’ time, but the question of the uses and limitations of long-term career data is an interesting one, and a conversation we ought to be having.
Thanks for your detailed comments. I take the point about the 1990s cohort studies. Perhaps I should have made clear that I think the situation has changed with the passing of the SBEE Act, and that to be of any use a new pilot would have to work with HMRC to see if we can join up tax and institutional data in an upwardly scalable and cost-effective way (clearly LDLHE is too small anyway, and HMRC tracking, especially given the future importance and long-term implications of student loan repayments, should make it easier to track graduates who have stayed in-country).
Similarly SOC is owned by ONS, which is notionally independent of government, and if we need greater clarity and differentiation this should form part of the process. With the possible abolition of the census there’s an opportunity to have a more dynamic coding system that doesn’t have to stay set in stone for aeons before being changed.
I think the point below about it being quite likely that arts graduates are fundamentally different to science graduates in terms of career paths is quite important – it may be that the current DLHE is inherently biased and really only usable for the latter.
Paul,
I agree with you on much of that, and you know that I feel strongly that as a sector we have not always appreciated DLHE enough or used it as effectively as we ought. There is a lot of value in the dataset and a lot of uses it can be put to within and outside the sector.
The place where you take issue with me – I offer two observations. The first is that I was on the last DLHE Review, and the review was composed almost solely of careers professionals. I think that had we been aware that KIS and its associated outputs (league tables especially) would have the impact they had, we might have addressed certain issues differently and welcomed the perspectives of stakeholder groups who, shall we say, came a little later to the DLHE party. This is our opportunity to do that.
The second is a little more personal. I don’t agree with some criticisms of DLHE data, but have been swayed of late by, amongst others, the perspective offered by sector leaders who are partly judged on this data and have to be confident in it. Part of the issue is, as you point out, some high-profile manipulation of the survey data.
Goodhart’s Law – “When a measure becomes a target, it ceases to be a good measure” – is often apposite in HE and I think now applies to DLHE. So regardless of our own views we need to be seen to be examining and overhauling outcomes data.
Hi Charlie
Fun times! I was at HESA’s first strategic agenda-setting meeting for this and the international issue was brought up a few times. Possibly universities are going to be left to their own devices when it comes to measuring international student success (not that salaries are remotely comparable in the first place). And I’d go further than salary not being a good measure of career success – the idea of ‘grad/non-grad’, a tail which has been wagging the statistical careers dog for far too long, is an equally dodgy metric…
Hi Michael
I think the feeling is that the semi-voluntary basis on which current international DLHE data is collected has potential if we can get more uniformity of response rates. It can take a lot of resource to collect international data and unless we suddenly acquire a lot of money from somewhere to do the work (you don’t know of any rich benefactors with an inordinate interest in international student outcomes?), we’ll have to carry on with the current level of resource.
I may have more to say on the grad/non-grad job question in the reasonably near future….