I saw some bluebells on a recent walk with my dog, as sure a sign of spring setting in as the HESA UK performance indicators.
So ignore storms Ciara and Dennis, the biting cold, and a chaotic reshuffle – let’s have a look at the widening participation indicators for 2018-19.
The what?
The UK performance indicators have been published by HESA since 2003 as a way of making standardised comparisons between higher education providers – overseen by the near-mythical (last met in 2017!) performance indicator steering group. There are currently two active sets – widening participation and non-continuation. Graduate employment is still technically live, but it needs some “graduate outcomes” data to populate it – the 2017-18 release, due later in the spring and based on the first collection of that data, will be experimental for this reason.
With UKPIs we are only concerned with undergraduate students from England, Scotland, Wales, and Northern Ireland – none of the postgraduates or Channel Islands students that crop up in other student data appear here. Entertainingly, the low participation element of the widening participation UK performance indicators is England only.
UKPIs revolve around benchmarks – a way of generating a sector average that looks just like a given institution’s student population in terms of background and subject spread. Though the benchmarks are not officially targets, the standard error is used to create the flags that we all grew to love in the Teaching Excellence Framework data. Many providers use the benchmarks as internal targets, even though HESA is characteristically coy about this practice.
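To make the idea concrete, here’s a minimal sketch in Python of how a benchmark of this kind can be computed – this is the general approach rather than HESA’s exact specification, and the cell definitions and figures are all invented for illustration:

```python
# A minimal sketch of the benchmark idea (not HESA's exact specification).
# Each "cell" groups students by characteristics such as subject and entry
# qualifications; the sector-wide indicator rate for each cell is applied
# to an institution's own mix of students.

sector_rates = {          # hypothetical sector-wide indicator rates per cell
    ("medicine", "A-level"): 0.06,
    ("history", "A-level"): 0.11,
    ("history", "BTEC"): 0.18,
}

institution_counts = {    # hypothetical student counts per cell
    ("medicine", "A-level"): 300,
    ("history", "A-level"): 150,
    ("history", "BTEC"): 50,
}

total = sum(institution_counts.values())
benchmark = sum(
    sector_rates[cell] * n for cell, n in institution_counts.items()
) / total

print(f"Benchmark: {benchmark:.1%}")  # the rate you'd expect given this mix
```

In other words, the benchmark is the rate an institution would show if its own students performed exactly at sector-average rates for people like them.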
Today’s main course
You’ll recall these stats were experimental last year following the inclusion of alternative providers. They’re back to being “official” now, which means we don’t really have much of a time series. We get to look at three main aspects:
- Low participation areas
- State schools
- Disabled Students’ Allowance
Of these, the third category is my least favourite – it records receipt of Disabled Students’ Allowance rather than the better known self-reported measure, and some students with disabilities actively choose not to claim the allowance. I think the self-reported measure is better for understanding the lived student experience, and I’m always going to prefer student voices to administrative data.
So focusing on the other two, here’s a couple of graphs on low participation areas:
The first tab (“data”) shows the location-adjusted benchmark (which takes into account the government region of domicile for students) against the percentage of students from low participation areas. You can filter by group, region of provider, and OfS registration status (as of the last time I checked it). The “+/-” filter lets you see the significance flags – remember these show where a provider sits significantly above or below its benchmark.
The second tab (“difference”) shows the difference in percentage points between the location-adjusted benchmark and the actual percentage, as a ranking. The same filters are available as above. Staffordshire, Suffolk, and Canterbury Christ Church head the table – at the other end the Royal Northern College of Music, the Royal Agricultural University, and Bradford have significantly underperformed.
Just to be clear – tables you see elsewhere will probably show the regular benchmark – I’ve plotted the location-adjusted one as I think it is a fairer way of looking at institutional performance.
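For the curious, here’s a sketch of how those significance flags work in spirit – a simple two-sided z-test against the benchmark. HESA’s actual flagging rules differ in detail, and the figures below are invented:

```python
# A sketch of benchmark flagging in spirit, assuming a simple z-test –
# HESA's actual rules and thresholds differ in detail.

def flag(indicator: float, benchmark: float, std_error: float,
         z_threshold: float = 1.96) -> str:
    """Return '+', '-', or '' depending on significant difference."""
    diff = indicator - benchmark
    if std_error > 0 and abs(diff) / std_error > z_threshold:
        return "+" if diff > 0 else "-"
    return ""

# Hypothetical figures: 14.2% of entrants from low participation areas
# against a location-adjusted benchmark of 11.5%, standard error 0.9pp.
print(flag(14.2, 11.5, 0.9))  # "+" - significantly above benchmark
```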
The state of schools
“What’s a state school?” is one of those questions that gets more complex the more you talk about it. The official definition is:
“All schools or colleges that are not denoted ‘independent’ are assumed to be state schools. This means that students from sixth-form or further education colleges, for example, are included as being from state schools. All schools in Northern Ireland are also treated as state schools.”
It’s not perfect because it only looks at the last provider attended – so someone who did up to GCSE level at Eton and then went to a state sixth form for A levels officially went to a “state school” in these figures. And the definition is worryingly negative – anything not labelled “independent” counts. There are many schools now (especially those offering special educational needs places) that don’t sit comfortably in either category. But it’s what we have.
There are two tabs, as with the low participation visualisation, and the filters work in the same way. Again I’ve used the location-adjusted benchmarks. It’s great to see the New College of the Humanities significantly outperforming expectations – Queen Mary, Kent, and Writtle are up there too. At the other end the Royal College and Royal Academy of Music struggle on this measure – with Oxford Brookes a notable low performer again.
Bonus beats
Ever wondered which subjects attract the greatest proportion of private school applicants? Wonder no more, it’s in one of the supplementary tables. Here’s a visualisation:
It’s medicine, history and philosophy, and languages. I’ve added the means to look at low participation areas as well, and to filter by entry qualifications. You need to be careful not to over-interpret subject areas with low student numbers (looking at you, combined studies) – there are numbers in the pop-ups to help with this, and a sketch of the problem below.
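Here’s a quick illustration of why small cohorts deserve caution – a 95% Wilson interval for a proportion widens dramatically as student numbers fall. The figures are invented, not drawn from the HESA tables:

```python
# Why small cohorts deserve caution: the uncertainty around a proportion
# grows rapidly as the number of students shrinks. Illustrative only.
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# The same 10% rate, at two very different cohort sizes:
print(wilson_interval(100, 1000))  # roughly (0.083, 0.120) - tight
print(wilson_interval(2, 20))      # roughly (0.028, 0.301) - very wide
```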