It’s hard to go to an HE event these days without a doomful prediction that the government is going to use the data at its disposal to shake up the university sector.
It’s easy to see where these fears come from: TEF is primarily a data exercise (for all the rolling back there has been since the original plans were issued). There’s a fear that the government’s continuing crack-down on international student visas will result in yet another data-based evaluation, under which visas might only be permitted at ‘acceptable’ institutions, or that only those programmes where students make a demonstrable economic contribution will be permitted to recruit from overseas. And, domestically, the Longitudinal Education Outcomes (LEO) data – due for release after the election – will unleash new and detailed data about students’ long-term earnings for all to see.
And, for anyone whose memory of higher education policy stretches back not very far, to the early days of the Coalition government, we have this prediction from then-minister David Willetts, who said: “I expect that, in the future, as the data accrue, the policy debate will be about the RAB charge for individual institutions.” The data is already accruing, and it’s possible to know with some accuracy the RAB charge (the portion of student loans which goes unpaid) at a pretty granular level. It’s time for the sector to take seriously the imminent shift to quantitative evaluation at every turn.
Why should I care?
Well, it depends. If you believe that sunlight is the best disinfectant then the more data out there the better. But perhaps that also presumes that the data will be nuanced, caveated and presented in a useful and digestible form. Experience from the last LEO data release in December suggests that this is far from likely. With the new sources of data, there will need to be careful analyses available to explain the implications to commentators, journalists and the public.
There are also fears around the future misuse of data which had originally been collected for benign purposes: where data can identify individuals by religion or ethnicity, does that put them at risk from the malign actions of a government with more sinister interests? And then there’s the thorny matter of data security, as last Friday’s attack on the NHS highlighted all too well.
But won’t the people in charge know what to do with the data?
You’d hope so. But this also presumes that the people running the show (UKVI, etc.) have accurate information. The headline numbers for student migrants come from the International Passenger Survey (IPS). The survey is flawed, the Office for National Statistics knows it, and there are attempts to improve it. This is potentially both good and bad for higher education. On the one hand, moving to exit check data (recording everyone leaving the country and cross-checking against the visa they held), rather than a partial survey, should give accurate information about the overall number of overstayers (which is believed to be much lower than IPS currently suggests).
On the other hand, the ONS has made proposals to link to HESA data – the implication being not just the aggregate numbers but tracking individual students. Amongst the ONS proposals for improving its data collection is the following:
“Link HESA data to HMRC data. This will tell us how many students are working in the UK during or after their studies. Investigate what linking Home Office exit check data to HESA will show for non-EU students, how many depart after their courses and how this relates to visa durations.”
The proposal could then show, for individual programmes and institutions, who stayed, who didn’t, and what those who stayed ended up earning. And that could be used to further restrict the incoming flow of international students.
Why would that be a bad thing?
Universities are concerned about being held responsible for purely instrumental outcomes (i.e. how much students go on to earn). If visas to study were only permitted where universities produce graduates in particular industries, or earning particular amounts, that could radically change approaches to student recruitment. What if universities were fined for their visa overstayers?
And it’s a similar story on LEO for domestic undergraduates. Universities are concerned that if a high proportion of their students never pay back their student loans (the RAB charge, as indicated by earnings data) then a new regime could see that institution denied access or given only limited access to student finance. Given that students who start out with high levels of social capital go on to earn more, might universities respond to the data challenge by rolling back the efforts to widen participation?
When there’s a game to be played, universities will find ways of playing it.
Where’s this all going?
With the potential to link together HMRC tax data, Home Office visa systems, Student Loan Company repayment information, and individual students’ HESA records, there’s a whole suite of things this current government, future policy makers, and academic researchers can do to influence and shape UK higher education. This is in addition to National Student Survey results, Destination of Leavers surveys, non-continuation data and so on.
There will be more data combinations, like it or not. The key question is what policy makers will choose to do with them. At worst, that might be restricting visas, sanctioning institutions, and monitoring individual students.
But let’s not assume that linking data must all be for the worse. Rather, higher education institutions and policy makers should engage in a more open debate about the risks and opportunities. This includes questions about how individual higher education institutions will go about developing the expertise to better understand and manage data. It’s not good enough for there to be a looming fear in the sector; we should have an open forum for debate about the detail of data, and the best ways to use it.