The introduction of the Small Business, Enterprise and Employment Act, as one of the final pieces of legislation of the last Parliament, paved the way for educational data to be linked to data from HMRC to examine the outcomes of leavers from education.
The Green Paper then stated that employment and earnings information derived from this linked data would be used as part of the common metrics that would feed into TEF for the purpose of informing applications for higher student tuition fees.
All of this has led to speculation that the venerable survey of graduate first destinations, which has been conducted by the sector for over 50 years and which is currently incarnated as DLHE, might be dropped. HESA’s announcement that outcomes data are under review has therefore excited a certain amount of comment. Why review something we no longer need? What is happening?
What the Small Business, Enterprise and Employment Act allows is the linkage of educational data to tax records. This means that, potentially, those with access to tax records can track someone’s educational journey through school, into HE and then into the labour market, and obtain accurate earnings information throughout their subsequent career. As the provisions state, this will be used to produce data on the returns to study for individual subjects and at individual institutions.
But there are good reasons why this isn’t the DLHE-killer some have painted it as. One is data coverage. HMRC data will give us excellent information about salaries, and when linked to elements of student record data, this will potentially allow a vivid picture to be built of earnings by all kinds of characteristics – not merely institution and subject, but gender, ethnicity and disability status, amongst others. Previous educational experiences can be brought into the mix. But it’s not clear how good the information on other aspects of employment outcomes will actually be. Industry and occupation data are not guaranteed to match the quality of the data collected through DLHE, and employment location may be limited to head-office addresses. Many very good objections have been raised about the use of salary data as a metric for university performance, and salary data alone simply isn’t enough for the many stakeholders in university outcomes data. We need more and better information.
But there are more subtle reasons why it is advisable for the sector to continue to collect outcomes data. The first is the question of ownership. HMRC data is, obviously, not under the ownership of the HE sector. Indeed, such are the restrictions on how the data might be used that it is possible – even likely – that nobody in higher education will ever be able to access the full raw data. We may be able to obtain extracts, partial datasets or processed data – the details are still to be worked out – but anyone dreaming of using the full data will probably have to moderate their ambition. On one level, this is not an issue – indeed, it may be of value for HE to be assessed using data that nobody can accuse institutions of manipulating in any way. But data that is not administered or controlled by the sector cannot be modified or adapted to suit changing needs. Our own outcomes survey allows us to ask questions at a national level – questions that need not be directly related to earnings – which can then be used by the sector as a whole or by individual institutions.
The second issue is more subtle still. DLHE data is collected with the informed consent of the participants. The same is not true of linked HMRC data, which is collected under Act of Parliament. We can’t very well ask students to actively consent to their tax data being used by institutions – I am no expert, but I can’t help feeling it would be a pretty tough sell – and so data that comes from consenting participants may still be valuable.
DLHE has certainly come in for a good deal of scrutiny of late. It was reviewed relatively recently, in 2010, but the subsequent rapid expansion of public metrics has led to the data no longer meeting the needs of a much-expanded group of significant stakeholders. The 2010 Review largely drew upon the needs and experiences of the careers and employability community who were (and in many cases still are) responsible for collection and were the largest consumers of the output of the survey. Careers professionals remain an important stakeholder group and one whose experience of collecting and using outcomes data should not be sidelined, but this information is also crucial to other sector stakeholders and any data must meet more diverse needs.
And this is our opportunity. A space remains for a sector-owned, population-level outcomes survey, collected from an audience with their knowledge and consent, which can address current sector needs. The most troublesome aspect of DLHE – the collection of salary data – is no longer vital. Everything else, from the reference date (we will probably move away from six months) to the sample coverage, is up for grabs. Should we keep a focus on employment outcomes? Should we take a more qualitative view? Can we use a survey of this nature to examine more fundamental questions about the individual’s perception of the value of higher education?
2016 will almost certainly see the last DLHE as we know it, but the long story of UK higher education outcomes data is not yet over.