Gibbet Hill Road, a thoroughfare that runs roughly through the middle of the main campus of the University of Warwick, has a peculiar feature.
If you stand outside Warwick Business School, and walk across Gibbet Hill Road to the Department of Physics building, you have passed from the county of Warwickshire to the unitary authority of Coventry. No other UK campus contains a county boundary, something that is bound to come up in the Christmas quiz.
The road and the boundary divide the campus into two roughly equal halves – with the notable side effect that a student with a room in the Rootes hall of residence lives in a different local authority to a peer living across the road in Cryfield.
How Warwick’s accommodation team could swing the next general election (possibly)
This has a notable impact on UK migration figures, to the extent that the Office for National Statistics makes a regular assessment of risk concerning the quality and availability of data from the University of Warwick accommodation service. The last one was in 2019 – if this is in your Christmas quiz, it’s probably me that is running it.
The Population Estimates Unit at ONS uses data from the accommodation team to make adjustments to the assessment of internal and external migration. The data itself isn’t much – just a note of how many bed spaces are in Coventry and how many are in Warwickshire – but it moves around 300 19-year-olds each year out of Coventry statistics (the campus has a Coventry postcode) and into Warwickshire.
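For illustration only – the figures and the method below are invented, not the ONS’s actual adjustment – the arithmetic amounts to something like this:

```python
# Toy illustration of the bed-space adjustment described above.
# All figures are invented; this is not the ONS's actual methodology.

# The micro-return: bed spaces on campus, by local authority
bed_spaces = {"Coventry": 4200, "Warwickshire": 1800}

def reallocate(campus_headcount, beds):
    """Split a postcode-derived campus headcount across local
    authorities in proportion to where the bed spaces actually sit."""
    total = sum(beds.values())
    return {la: round(campus_headcount * n / total) for la, n in beds.items()}

# 1,000 students all counted against the campus's Coventry postcode...
print(reallocate(1000, bed_spaces))
# {'Coventry': 700, 'Warwickshire': 300} - roughly the 300-person
# shift described above
```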
So, shift an extra 300 international students (let’s say) from one to the other. One county has a sharp year-on-year increase in immigration – a national statistic that will be hurled at voters from pamphlets and hustings by parties looking to make political hay with the issue. Would this sway voters in Warwick and Leamington in sufficient numbers to unseat Matt Western? Would this be a bellwether seat that shifts the election decisively? In all honesty, probably not this time, but it is worth remembering how even the smallest data collection can have a big impact.
Sector data in national statistics
What I’m attempting to illustrate here is the outsize impact that student data has, a long way from the niceties of regulation and funding. This peculiar micro-return from one provider is an extreme example, but it is far from the only contribution the sector makes to the work of the ONS.
How does ONS know how old students are? And what sex they are? And where they come from? It gets that data from the annual HESA Student return, which itself plays a huge role in migration statistics. University students are among the most likely groups of people to move between local authorities, and to move into the country for what is described as “long term” (longer than 12 months) residency. So HESA Student data – not just from Warwick, from everywhere – is a key facet of politically volatile national statistics on immigration.
The ONS knows that HESA data is reliable, within known parameters. It does get a little annoyed that the data is lagged by up to 17 months – so it has a semi-secret agreement with Jisc (the old HESA bit) to provide student data in December rather than (when the rest of us get it) in January. And it is eagerly looking forward to Data Futures – which, we are told, “will greatly increase the timeliness of the data, which will improve the accuracy and quality of the overall estimates”.
The ONS also knows that HESA data is reliable because:
Data supplied to Jisc are subject to an extensive quality assurance process with a range of automated validation checks that are applied to all submissions. Providers first validate the data themselves, as explained on HESA’s Validation overview webpage and then Jisc puts the data through quality rules, as shown on HESA’s Quality Rules Directory webpage. If the data fail this check, they are returned to the university to be corrected.
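As a hedged sketch – the rules below are invented stand-ins, not anything from HESA’s actual Quality Rules Directory – that two-stage process looks something like this:

```python
# Sketch of the two-stage check described in the quote above:
# provider-side validation, then central quality rules, with
# failing records returned to the university for correction.
# The rules themselves are invented examples.

def provider_validation(record):
    # Stage 1: the provider's own checks before submission
    return bool(record.get("student_id")) and record.get("age", 0) > 0

def quality_rules(record):
    # Stage 2: central rules applied to every submission
    failures = []
    if not 16 <= record.get("age", 0) <= 110:
        failures.append("implausible age")
    if not record.get("postcode"):
        failures.append("missing postcode")
    return failures

def submit(records):
    accepted, returned = [], []
    for record in records:
        if provider_validation(record) and not quality_rules(record):
            accepted.append(record)
        else:
            returned.append(record)  # back to the university to fix
    return accepted, returned
```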
It also rates the usefulness of the data using the Quantitative Quality Indicators approach. Here, we learn that the data has been of consistent quality since 2016 – with known issues (the high level of implausible postcodes is one) being identified and thus compensated for.
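The postcode issue is easy to picture: a plausibility check need be no more than a format test. The regex below is a simplified approximation of the UK postcode format, and the “ZZ” pseudo-postcode convention for unknown addresses is borrowed from other public datasets – neither is HESA’s actual rule:

```python
import re

# Simplified approximation of the UK postcode format - a toy stand-in
# for the kind of check that flags "implausible postcodes".
UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$")

def plausible(postcode):
    pc = (postcode or "").strip().upper()
    if pc.startswith("ZZ"):  # ZZ pseudo-postcodes mark unknown addresses
        return False
    return bool(UK_POSTCODE.match(pc))

print(plausible("CV4 7AL"))   # True - the Warwick campus postcode
print(plausible("ZZ99 9ZZ"))  # False - a dummy code
print(plausible("not a pc"))  # False
```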
Into the future
So, when the Office for Students says things like:
The challenges experienced with the implementation of Data Futures create additional risks to the quality of data delivered by Jisc at the end of the process, and we were already planning on the basis that material changes to data collection systems can result in a reduction in the data quality that can be achieved.
you had better believe that people far beyond the higher education sector sit up and take notice. Issues with student data have a knock-on effect on national population data, which has a knock-on effect on migration data. And at that politically sharp end of the process mere caveats or “explanations of known weaknesses” are not going to cut it.
Other customers – local authorities, or the NHS, for instance – will also struggle with data of a lower quality this time round. This will have a knock-on impact on the provision of all kinds of services and support, both for students and for the wider population.
And when the Office for Students says things like:
We are pausing implementation until we have completed an independent review of the current issues with the delivery of the Data Futures programme and can be confident that the move to in-year data collection can be achieved effectively for providers and the OfS.
be aware that this has an appreciable impact on the quality of immigration data – and that cells in Home Office risk monitoring spreadsheets are flicking from amber to red. And that, even if the regulator and the Department for Education are broadly content with the status quo and an acceptable level of lag and data issues, the Home Office and the data demands it can make sit further up the Whitehall hierarchy – especially during an election year.
How it feels for vice chancellors
To the extent that institutional leaders are thinking about the issues with Data Futures at all, they are thinking about the impact on the use of data in regulation, or – if they are wise – the impact of the failing system on the staff that have to deal with it. The Office for Students’ concerns, thus far, are about the use of HESA Student data (and the other datasets that will be delayed thanks to the Data Futures problems) in regulation.
For an innovation that was meant to have at least half an eye on reducing data burden (remember when we used to talk about that?), the fact that it has reached the vice chancellor’s office at all should be a bit of a red flag. There have been issues with the way OfS has managed Data Futures (changes, delays, and restarts too frequent to count as regulatory demands have shifted), and with the way HESA (via the initial implementation and now the Jisc-led approach) has overseen the technical end of things – but the time for blame has long passed. It is now time for action – beyond the pause in roll out and the full review of the programme.
When the Office for Students became a provider of official statistics back in 2018, it signed up to the Code of Practice for Statistics covering the trustworthiness, quality, and value of statistical work. HESA (and, therefore, Jisc) also holds this designation. Clause 3.1 notes:
Statistics should be produced to a level of quality that meets users’ needs. The strengths and limitations of the statistics and data should be considered in relation to different uses, and clearly explained alongside the statistics.
There are, of course, users and needs of varying importance. The people who produce immigration statistics are quite high up that list, and will make their displeasure felt in ways that your vice chancellor can only dream of. The quality of student data has an outsize impact on the way the sector is perceived in government.
OfS, HESA, and Jisc took a risk when they chose to move Data Futures implementation forward – any changes to data collection will yield a (small) initial dip in quality, but this will usually be made up for by improvements later in the process. Right at the moment, this year’s data looks like it will suffer from a serious drop in quality, while the improvements have been put on hold. At best there will be widespread disquiet – at worst speedy action could be taken that would transform data in the sector without the consent or oversight of those involved.
I’m sure you’re right about Warwick being the only one, but I always thought that the Sussex campus encompassed parts of Brighton and Hove Unitary and parts of Lewes District (in East Sussex) councils. Am I wrong (and if so, have I always been or is it a boundary changes thing)?
I also wondered, as Reading University’s campus is split between Reading and Wokingham Borough Councils.
In the author’s case, Coventry is in the West Midlands county (which isn’t noted, but “Council” is, which causes the query) whilst Warwickshire obviously is not.
Your examples are both in East Sussex county, with a council split, like mine.
So his pub quiz question survives another day.
Fascinating stuff. All depends where you define the campus as starting and ending, I think! For me Warwick is the only one that needs an ONS statistical work-around, so I think that puts it on a level above Reading and Sussex.
Sounds like there’s a new niche rankings subset for over-enthusiastic marketing departments.
So checking this out with OS maps confirms that, with the exception of the sports pavilion, the UoS campus is entirely in Brighton and Hove.
Reading has a stronger case, in that the Henley Business School is split between Wokingham and Reading (the Debating Club is entirely in Reading, giving it an unparalleled chance to take up the war on woke). I foolishly thought Berkshire was the top-level authority in the area, but both Wokingham and Reading are unitary authorities!
Brighton and Hove is in the historic county of East Sussex, but not the administrative one.
And in historic county terms, Coventry *is* in Warwickshire.
So the pub quiz question gets more complex … (although possibly redundant, in the case of Sussex)