
Questionable data and student immigration policy

At the heart of the never-ending fight between the Home Office and universities is a question over the reliability of the government's migration statistics and the International Passenger Survey. David Morris breaks down this complex debate.

David Morris is the Vice Chancellor's policy adviser at the University of Greenwich and former Deputy Editor of Wonkhe. He writes in a personal capacity.

Why are students included in the net migration figures? This question has been asked, and asked again, by exasperated representatives of the higher education sector for a number of years, desperate to reduce universities’ exposure to Theresa May’s goal – first at the Home Office and now at No. 10 – of reducing net migration by any means necessary.

The argument seems simple enough: students are merely temporary migrants, in most cases staying no more than three or four years. Each year, the number of students leaving the country at the end of their courses should be – at least approximately – equal to the number of new students entering to begin their studies. The public appears to agree: the most recent polling for Universities UK found that just 24% think of international students as immigrants. At the very least, argue the universities, students should be removed from the government’s political target to reduce net migration, if not the official figures themselves.

From the perspective of the Prime Minister and officials in the Home Office, the logic in universities’ argument for removing students from net migration statistics is faulty. If students entering and leaving the country merely ‘cancelled each other out’, the net migration statistics, based upon the International Passenger Survey (IPS), should show this. On this reading, the net migration figures reveal the number of students staying on past their courses. As the Office for National Statistics states (pdf):

“Any more permanent impact of student migration on overall net migration figures therefore largely relates to the extent to which those coming to the UK to study stay beyond their course length for additional years.”

This is arguably a very understandable reason for continuing to keep students in the net migration figures.
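To make that inference concrete, here is a minimal sketch of the arithmetic; the figures are invented for illustration and are not ONS data. If departures really did cancel out arrivals, net student migration would be close to zero, so any large positive figure gets read as students staying on.

```python
# Illustrative arithmetic only -- these figures are invented, not ONS data.
student_arrivals = 170_000    # hypothetical student immigrants in a given year
student_departures = 150_000  # hypothetical former students emigrating that year

# On the Home Office reading, whatever fails to cancel out is attributed
# to students staying on beyond their courses.
net_student_migration = student_arrivals - student_departures
print(f"Net student migration: {net_student_migration:,}")  # 20,000
```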

In 2012 the IPS introduced a new question asking respondents leaving the UK why they originally came. This allows an estimate of the number of people who arrive as students and who actually leave the country at the conclusion of their studies. It is vital to emphasise that the IPS – this most politically charged of statistical sources – is a sample survey, and its results carry margins of error. All publicly announced figures on migration are statistical estimates, a nuance rarely reported in the public realm.
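To illustrate why these are estimates rather than counts, here is a rough sketch of the margin of error on a survey-based total. All the numbers below are invented, and the real IPS uses a far more elaborate stratified, weighted design, so its actual margins differ.

```python
import math

# All numbers invented for illustration; the real IPS uses a stratified,
# weighted design, so its actual margins of error differ.
sample_size = 4_000     # travellers interviewed in a hypothetical stratum
student_share = 0.05    # fraction identified as long-term student migrants
population = 3_000_000  # travellers the stratum is scaled up to

estimate = student_share * population

# Standard error of a proportion under simple random sampling,
# scaled up to a population total.
se = math.sqrt(student_share * (1 - student_share) / sample_size) * population
low, high = estimate - 1.96 * se, estimate + 1.96 * se
print(f"Estimate: {estimate:,.0f} (95% CI roughly {low:,.0f} to {high:,.0f})")
```

Even with a sample of several thousand, the sketch produces a confidence interval tens of thousands wide – the same order of magnitude as the political arguments built on the published figures.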

The ‘overstayers’ conundrum

The 2015 IPS figures showed that 135,000 more students arrived in the UK than former students left. Yet as the University of Oxford’s Migration Observatory points out:

“This snapshot must be interpreted carefully, because the people arriving and leaving are part of different cohorts… Eventually, it will be possible based on IPS data to construct an estimate of what share of each year’s cohort of students leaves, but this is not currently possible because the relevant data only goes back to 2012. However, if the current number of student inflows and outflows remained stable at these levels for several years, it would suggest that a majority of students were not going home.”
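The cohort point is easiest to see in a toy simulation, with all parameters assumed purely for illustration: even if every student leaves exactly on time, growing intakes mean that any single year’s snapshot shows arrivals outstripping departures.

```python
# Toy cohort model, purely illustrative: every student stays exactly
# course_length years and then leaves -- zero overstaying by construction.
course_length = 3
intakes = [100_000, 110_000, 121_000, 133_000, 146_000]  # growing annual intakes

for year, arrivals in enumerate(intakes):
    # This year's departures are the cohort that arrived course_length years ago.
    departures = intakes[year - course_length] if year >= course_length else 0
    print(f"Year {year}: arrivals {arrivals:,}, departures {departures:,}, "
          f"snapshot gap {arrivals - departures:,}")
```

By year three the snapshot gap is 33,000 even though, by construction, nobody in the model overstays: the gap reflects intake growth, not behaviour.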

These caveats have not made the Home Office similarly reserved in drawing conclusions from the IPS. Ministers have been happy to state in public that students are not leaving the country after completing their courses, whether through legal means such as switching to a different visa, or illegally – the so-called “vanishing” or “overstayer” students.

Successive clampdowns on international students have thus focused on ensuring that international students are “genuine” and – despite the trial of exemptions for some universities – have narrowed the available routes to obtaining alternative visas upon completing their studies, most infamously with the abolition of post-study work visas.

Yet it has gradually emerged over the past year that the IPS, in spite of attempted refinements, may not be reliable enough to give us an honest picture. Last Thursday, the ONS, as part of a wider review of national migration statistics, confirmed that it would be reviewing how the IPS counts the number of international students leaving the country upon completion of their studies:

“One of the significant current challenges is to understand what former international student immigrants do when they complete their studies… The IPS figures of international students immigrating to the UK are consistently higher than the IPS figures of former international students emigrating. We are working collaboratively with other government departments to investigate what other sources can tell us. This is a complex area which will require analysis of several datasets, drawing on the expertise of data providers across government. We will publish an article by mid-2017 giving a further update of progress in this area.”

What are you waiting for?

The recent reintroduction of more comprehensive exit checks at the border was partly an official recognition of the IPS’s imperfections. These checks began in 2015, and should finally enable the government to build a comprehensive picture of international students’ exits.

Wonkhe understands that the ONS and Home Office have been aware of this issue for at least a year, as comparative analyses of the IPS alongside the new exit checks have led to calls for a review. The ONS’s revelation is hardly news in the public domain either. Back in October a Whitehall source told The Times that one revised piece of Home Office analysis showed that barely 1% of international students – equivalent to about 1,500 per year, and less than a tenth of the current official figures – were ‘overstayers’. In September, a report from IPPR went into further detail about the shortcomings of the IPS, particularly around international students:

  • Respondents may not be able to correctly remember or specify precisely why they first came to the UK, particularly if they originally came to study but then started working.
  • Former students may express an intention to return to the UK within a year – and thus not be classed as emigrants – even if they do not return after all.
  • Students may be counted as migrants as they enter the UK, but not counted as migrants when they leave, if they complete a course (usually postgraduate) in less than 12 months which was originally supposed to take a full year.
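The third bullet describes an asymmetry in who meets the 12-month threshold used to define a long-term migrant. A minimal sketch, with invented stay lengths:

```python
# Illustrative sketch of the asymmetry in the third bullet; stay lengths invented.
# Under the UN definition, a stay of 12+ months makes someone a long-term migrant.
LONG_TERM_THRESHOLD = 12  # months

intended_stay = 12  # intends a one-year course, so counted as an immigrant on entry
actual_stay = 10    # the course wraps up early, as some postgraduate courses can

counted_in = intended_stay >= LONG_TERM_THRESHOLD   # True
counted_out = actual_stay >= LONG_TERM_THRESHOLD    # False

# The same person adds one to net migration: counted on the way in,
# never counted on the way out.
print(f"Counted in: {counted_in}, counted out: {counted_out}")
```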

IPPR argued that, although the IPS suggests a very large number of students are overstaying (and therefore counting towards net migration), it is remarkable that no such influx has been picked up by other data sources, including the Annual Population Survey, student visa compliance rates, and visa renewal refusal rates.

As it happens, the statistics released last Thursday already point towards the IPS’s possible unreliability when set against other available data. The IPS showed a 23% fall (41,000) in long-term immigration to the UK to study from September 2015 to 2016, though the number of visas issued over the same period to non-EU students for 12 months or more was up by 2%. The ONS state that “there have been differences in the trends between the IPS and visa data for study in the past, in particular around 2009 and 2010, so this is not unprecedented”, and that sampling errors or seasonal variations in the IPS might be the cause. Yet the sudden emergence of such a significant gap between the two data sources for the first time in seven years must induce some head scratching.

It is quite possible that a resolution to this most wonky of sagas will come in the next few months. The Home Office told The Times back in October that it would “assess and analyse” exit-check data “to understand the extent to which estimates provided are statistically robust”, and the ONS’s latest announcement would appear to give such work added impetus. I understand that it is only a matter of time before the government significantly revises its methods (and by extension its figures) for calculating student exits.

Data, politics and policy

One reason given for Theresa May’s opposition to taking students out of net migration is her belief that doing so would be seen as a “fix” by the public, undermining faith in the government’s immigration policy. Yet if figures on students’ contributions to net migration are significantly revised, the outcome would be much the same. A revision could reshape the debate not only on international students, but on immigration more widely – the most significant public policy issue of the day. This would not only be a wonky embarrassment. It would be a significant political embarrassment, particularly when several senior members of the Prime Minister’s cabinet, including the Chancellor and the Foreign Secretary, have disagreed with her stance on international students over the years.

It would be astonishing if a failure of data collection were to be the prime cause of one of the great policy disasters of recent years: the government’s deliberate clampdown on universities’ ability to recruit international students. It would also underline just how crucial robust and reliable statistics are to vitally important government decisions, and why no statistics should go unquestioned.
