When international applicants have their heart set on the UK and are choosing between our idyllic towns and cities, I’ll bet they’d be keen to know how likely they are to drop out or get “good honours” – especially in comparison to their UK-domiciled peers.
Both factors would tell them something about the nature of the education and support on offer – so they might be surprised to learn that there’s a national undergraduate attainment gap (good honours) of over 10 percentage points.
It’s your letters
When our caretaker Secretary of State wrote to the Office for Students in September, you may recall that he included something of a sting in the tail for a sector that was still busy popping corks at the news that the Government will reintroduce post-study work visas for international students.
“We will not be able to rest on our laurels”, he opined, and “it is critical that international students receive a world-class experience.” His direction to OfS was that he would like it to consider steps to ensure international students “feel integrated on campus; are supported in terms of their mental health and wellbeing; and [that they] receive the employability skills they need and are supported into employment, whether in their home country or the UK.”
And there was more. It will, he said, be “critical to ensure the OfS makes public transparent data on the outcomes achieved by international students, including those studying wholly outside the UK, such as it does for domestic students.” And not only that, “such data should also inform the approach the OfS takes to setting and monitoring compliance with its quality requirements.” So we thought we’d have a think about the data that’s out there, and what’s currently conspicuous by its absence.
First let’s have a think about the Teaching Excellence and Student Outcomes Framework (TEF). Its basket of metrics comes in for criticism from all sorts of quarters, but in this context one of the more curious features is that it pretends to represent an outcomes assessment for all students at a given provider. That’s true for the National Student Survey satisfaction metrics – but absolutely not true for continuation and employment outcomes, where only UK-domiciled students are in the basket.
Collecting the latter would be fraught with difficulty and would suffer even more acutely from the benchmarking and context issues that LEO does. For the former, OfS’ reasoning is that
Non-continuation data we have published previously is within the performance indicators which benchmark using entry qualifications. We do not have the level of detail on entry qualifications we would need to benchmark and even if we did it would be questionable whether we would have sufficient students with each set of qualifications to meaningfully benchmark.
…although intriguingly it adds that:
We don’t currently have any plans to publish provider level data although we do include international students when we consider provider performance at registration as this uses absolute values ie it is not benchmarked
This is probably news to most people (largely because OfS has thus far refused to reveal the Colonel’s secret list of 11 herbs and spices it uses to judge whether someone can pop onto the register or get a condition) but it does raise all sorts of issues. Do international applicants gawping at a TEF medal being thrust in their face by an international recruiter know that the judgement is faulty when it comes to them? We certainly couldn’t find a national body attempting to explain the TEF that makes clear that international students are missing from most of the metrics.
And it’s not just about the TEF. Visitors to Discover Uni (which in its own words “includes data collected from universities and colleges about all their students”) would never know that the non-continuation data that pops up in whizzy bar charts for every course (data that isn’t benchmarked) ignores internationals. Surely international applicants need the information? And at a sector level, it looks like the last time we got data out of HEFCE was in 2012 – seven years ago now – and the picture could doubtless vary hugely by provider, programme and domicile.
The attainment gap
Then there’s the issue of attainment. Here we can find some data – although it’s not as if it’s easy for international applicants to access. My colleague David Kernohan has kindly plotted the percentage point difference in 1st and 2:1 degrees between international and UK domiciled graduates, and as a “Yes, but does it correlate?” bonus the numerical difference between international and UK domiciled graduates.
At a sector level the attainment gap is just over 10 percentage points – not quite the 13.2 percentage point gap faced by UK domiciled BAME students, but not far off – and that average hides some big extremes. Negative gaps aren’t really justifiable – if entry standards are as robust and consistent as universities say they are, and the right support is in place for international students when here, there really ought to be no gap. And applicants really ought to know if there’s a big gap, because of what it suggests about a provider’s standards of support.
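It’s worth being precise about what a “percentage point” gap is, since it’s easy to confuse with a relative (percentage) difference. A minimal sketch, using invented figures rather than the HESA data plotted above:

```python
# Hypothetical "good honours" (1st or 2:1) rates by domicile.
# These numbers are made up for illustration; they are NOT HESA data.
uk_good_honours = 0.78    # 78% of UK-domiciled graduates
intl_good_honours = 0.68  # 68% of international graduates

# The gap discussed in the piece is an absolute percentage-POINT difference...
gap_points = (uk_good_honours - intl_good_honours) * 100

# ...which is not the same as the relative (percentage) difference.
gap_relative = (uk_good_honours - intl_good_honours) / uk_good_honours * 100

print(f"Gap: {gap_points:.1f} percentage points")   # 10.0 points
print(f"Relative gap: {gap_relative:.1f}%")         # ~12.8%
```

The same 10-point gap reads as a larger relative shortfall the lower the baseline rate is, which is one reason comparisons across providers with very different overall award profiles need care.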
There could be all sorts of reasons for gaps, and domicile and subject mix will play a part. You may have some of your own cod theories too about social capital or mental health or currency fluctuation. But if the story of the BAME attainment gap tells us anything, it’s that after the cod theory stage, to make a proper difference we first need rich, nationally published data – then decent research that gets at the lived experience – and then some pilot projects and some solutions. It feels like we’re some distance from that right now.
And the rest
As I noted above, employment outcomes are going to be fiendishly hard to measure, let alone benchmark – and TNE will have similar benchmarking problems. As for Gavin’s other exhortations – “feeling integrated on campus” is kind of related to NSS Question 21 (although OfS doesn’t publish national home/EU/non-EU splits for NSS); “feeling supported in terms of their mental health and wellbeing” was sort of a pilot postgraduate NSS question but isn’t out there yet; and “receive the employability skills they need” sounds like asking OfS to regulate on outputs rather than outcomes, which it will both resist in principle and fail at measuring if it tried in practice.
If you take the view that OfS is a legitimate body designed to regulate and intervene in a market, it’s certainly true to say that the international market has distinct features, and needs a dose of dedicated regulation. Even those opposed to the “marketisation” of the higher education sector don’t pretend that we’ll be scrapping international students’ fees any time soon, or introducing international number caps. And the squeamishness at the notion of “market regulation” around the nations also feels dangerously home domiciled in its thinking. Williamson also referenced the regulator’s work on “harmful student recruitment practices” – and OfS is, after all, not called the “Office for UK domiciled Students”.
So what’s the issue here? Some data on international students – employment data in particular – is hard to collect reliably. International students are more likely to leave the UK to work than other students, not least because of our ridiculous “hostile environment”. Data on continuation, and the NSS, is collected for international students – but is not routinely presented to us in a disaggregated manner.
What data we do get is troubling. A 10 percentage point attainment gap has nothing to do with the bright images applicants are sold, and the fact that we even had to ask HESA for this data to find out is testament to the lack of attention international students get in policy. If we want to put the long-held suspicion that international students are simply “cash cows” to rest, Williamson’s call for “public transparent data on the outcomes achieved by international students” may be the smartest ministerial intervention we’ve seen for a while.
3 responses to “Gavin Williamson is right on international students”
This presents a rather negative slant on UK higher education. The USA is the #1 destination for international students, and NAFSA: Association of International Educators has consistently reported that international students’ retention rates relative to US students vary greatly between universities. Although the reasons NAFSA identified for low retention included financial stresses, academic progress difficulties and English-language problems, a further significant factor was a desire to attend an institution considered a “better fit” (trading up, in other words).
Jim Dickinson’s blog provides no comparative data at all, and so it is impossible to assess whether UK HEIs’ record is better, worse or the same as that of other benchmarked universities.
Confining attention to the UK comparison, the blog also essentially compares apples and pears. The UK graduate population consists of completely different mixes of students in terms of funding, prior attainment, part-time/full-time, continuers or movers between HEIs. Very likely the subjects being studied also differ and probably there are also significant demographic differences.
The Wonkhe Tableau graphic controls for none of this. While a caveat is thrown in (“There could be all sorts of reasons for gaps, and domicile and subject mix will play a part” – quite!) that certainly doesn’t prevent the author from jumping to conclusions: a 10 percentage point attainment gap, descriptions of international students as cash cows, the assertion that “Negative gaps aren’t really justifiable – if entry standards are as robust and consistent as universities say they are, and the right support is in place for international students when here, there really ought to be no gap.”
To support these conclusions, it would need to be shown that after controlling for relevant variables, the proportion of non-UK students no longer in HEI anywhere 12 months after entry was lower than for UK students.
The Tableau shows a good honours attainment gap by provider and the 10% refers to the same. How is that justifiable?
Contrary to the opinion that international employment outcomes are “fiendishly hard to measure”, we should not have to “make do” with a 25% Graduate Outcomes response rate and a woefully low Chinese sample. With a more innovative methodology, Asia Careers Group has been measuring and benchmarking international employment outcomes since 2016, for over 42,000 individuals in Asian markets, weighted according to UK market share (i.e. 11,000 Chinese students). ACG tracks career progression over time and average income. ACG data is being used by forward-looking UK HEIs to improve and enhance the student experience and graduate outcomes. Subscribers include: ARU; Aston; Coventry; DMU; Durham; Exeter; King’s; NTU and Sheffield. These institutions should be commended for investing in their international graduate outcomes in advance of the rest of the sector.