The Home Office reviews English language assessment

But it will be how it uses the findings that matters

Michael Salmon is News Editor at Wonkhe

Cast your mind back to the MAC review of the graduate route in May 2024 – the most detailed government response, buried amid the post-general election taking out of the trash, included a note that the then government would “review with the sector how they assess English Language standards for international students, with the objective of standardising independent assessment.” This was despite the MAC review having had nothing to do with this particular aspect of international student policy.

(The same letter also promised to apply tougher sponsorship standards to student sponsors – under Labour this morphed into the BCA reforms in the immigration white paper.)

Later that year there was word that the Home Office was particularly interested in English language self-assessment arrangements, driven in part, it seemed, by fairly anecdotal evidence rather than anything more systematic. This is the ability of higher education providers with appropriate sponsor status to assess students’ English levels themselves, or to allow the use of tests which are not on the Home Office list of “secure” English language tests (the so-called SELTs, which include IELTS, Pearson PTE, and a small number of others).

A survey was conducted last spring, and the results are now out. What we don’t get is much context – such as whether the government still sees the need for more “standardisation.” The language has been somewhat softened to an intention to “improve understanding of the processes that are currently being followed.”

While essentially all higher education institutions that responded (some 144) predictably continue to accept various SELTs, 92 per cent accept some other tests as well, in particular TOEFL and Cambridge English. And the results make clear that almost all institutions review the validity (97 per cent) and test security (92 per cent) of the non-SELTs they use (indeed, one of the most interesting things about the recent high-profile case of a test provider being fined over issues with its testing was how quickly universities had begun sounding the alarm).

On the subject of test integrity, given the numbers of international students involved there are no particular red flags:

A small proportion of HEPs (12%) reported not encountering any instances of fraud, while most were likely to have experienced fraud either rarely, very rarely, or almost never (71%). Around one-sixth (16%) had experienced fraud sometimes, with a minority (2%) saying fraud occurred often at their institution.

Most institutions also have post-enrolment systems in place for dealing with enrolled students seen to be struggling, though this does then invite the question of how effective these are proving, which the survey doesn’t get into.

There are plenty more findings, and the results contribute usefully to an area where research is relatively scarce. What there isn’t is an obvious chink in the armour that would give impetus to any particular reform – a lack of ID checks, say, or an absence of fraud prevention training for admissions staff (it’s all there).

This all frustrates attempts to use the issue of universities’ relative autonomy in this space as a stick to beat the sector with, as Policy Exchange sought to do in a report last year. The think tank’s other recommendation was to raise the level of English required for university entry, which isn’t something this kind of review can really speak to – but there’s always the potential for this issue to rear its head again, you feel.

Only about a third of respondents did their own in-house English testing, though this is somewhat hard to square with the fact that 78 per cent accepted pre-sessional English courses or pathway programmes, which presumably tend to use bespoke exams. We don’t get the data tables to check this, but the assumption would have to be that in-house testing is more common at larger universities with the resources to accommodate it. Financial pressures (and the potential for Home Office scrutiny, of which this review is a case in point) will likely continue to disincentivise it – an enormous shame, in my experience, as in-house testing done well can be far better tailored to students’ needs than off-the-shelf equivalents.

Which external tests universities accept continues to be an issue of great commercial importance to the testing providers involved, which leads to the occasional entertaining war of words – but more importantly, it makes the ongoing Home Office tender for a new secure English language test, now described as potentially “digital by default”, all the more important for the sector to keep an eye on.

What the Home Office will conclude, following its drive to “improve understanding” of the processes involved in English language testing in higher education, remains to be seen.

1 Comment
Jonathan Alltimes
2 months ago

Candidates should be tested in person and prove their identity at entry.

The English competency standards for admission should be raised.

For each English test, a degree-specific vocabulary test should be added for common key technical terms used in the admission qualifications.