And so it begins. January brings the sense that winter is drawing to a close, a new start, and an onslaught of fabulous HESA data – every week from now through to June.
I’m not going to promise to cover every release in depth, but hopefully for most I will be able to pull out a handful of key visualisations, possibly showing new aspects of well-known releases. And this week’s entry, the HESA Student Statistics (2017-18), is very much a case of HESA starting the set with the hits. This is one of the best-known datasets, presented here at a user-friendly summary level. The detail will arrive on 30 January, but even this very top-level look at the size and state of the sector offers a few unusual insights.
Cross-border recruitment
What should intrigue us about this graph is the split between the four UK HE systems. In the main, UK students study in their country of domicile, but a few English institutions seem to specialise in attracting Northern Irish, Welsh, and (to a much lesser extent) Scottish students. The filters allow us to look across different levels and modes of study for each selected domicile. The highlighter is the easiest way to find a particular institution, and the colour shows the mission group.
Maps, though undeniably pretty, are often not as easy to read as other visualisations, so here’s a look across the same data as a self-sorting plot. Here I’ve added “provider country” as a filter, allowing you to see which English providers recruit the most Scottish (Newcastle), Welsh (UWE), and Northern Irish (Liverpool John Moores) students.
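If you’d rather dig into the underlying table yourself, the same ranking can be pulled out of the open data with a few lines of pandas. This is only a rough sketch: the file name and column names (“Provider”, “Provider country”, “Domicile”, “Number”) are my assumptions about how a CSV export might look, not HESA’s actual schema.

```python
import pandas as pd

# Hypothetical CSV export of the enrolments-by-domicile table;
# file name and column names are assumptions, not HESA's actual field names.
df = pd.read_csv("student_enrolments_by_domicile_2017-18.csv")

# English providers, students domiciled elsewhere in the UK
english = df[df["Provider country"] == "England"]
cross_border = english[english["Domicile"].isin(["Scotland", "Wales", "Northern Ireland"])]

# For each domicile, rank English providers by headcount and keep the top five
top_recruiters = (
    cross_border
    .groupby(["Domicile", "Provider"])["Number"]
    .sum()
    .sort_values(ascending=False)
    .groupby(level="Domicile")
    .head(5)
)
print(top_recruiters)
```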
Subject of study
What are the trends in the subjects students choose to study? It is pleasing to see a nice five-year time series by subject of study, mode, and level within the (excellent) open data presentation. I couldn’t resist plotting it for myself.
One thing to note is how much of the decline in part-time study is found in subjects allied to medicine for women and in business studies for men.
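For anyone wanting to reproduce something similar, here’s a minimal sketch of the sort of plot I mean, again assuming a hypothetical CSV export with columns such as “Academic year”, “Subject area”, “Mode of study”, and “Number”.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export of the subject-of-study open data; column names are assumptions.
df = pd.read_csv("student_enrolments_by_subject.csv")

# Part-time enrolments by subject area across the five-year series
part_time = df[df["Mode of study"] == "Part-time"]
series = (
    part_time
    .groupby(["Academic year", "Subject area"])["Number"]
    .sum()
    .unstack("Subject area")
)

series.plot(figsize=(10, 6), title="Part-time enrolments by subject area")
plt.ylabel("Students")
plt.tight_layout()
plt.show()
```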
Grade inflation
Yesterday the OfS, seemingly at random, released a comparison of the populations used by each of the three measures we now have for grade inflation. For reasons I don’t entirely understand, the main OfS grade inflation measure, the TEF grade inflation supplementary metric, and the figures HESA calculates as part of this release all use different assumptions.
More for completeness than anything, here’s the HESA method. You’ll see that there are no issues with grade inflation for part-time students, and that the supposed problem transcends national boundaries.
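As a rough illustration of what a measure along these lines involves, here’s a sketch that works out the share of first-class degrees per year, split by mode of study. The file and column names (“Academic year”, “Mode of study”, “Classification”, “Number”) are assumptions for illustration only, and this ignores the cohort adjustments the various official measures make.

```python
import pandas as pd

# Hypothetical export of a qualifiers-by-classification table; column names are assumptions.
df = pd.read_csv("qualifiers_by_classification.csv")

# Total qualifiers per year and mode of study
totals = df.groupby(["Academic year", "Mode of study"])["Number"].sum()

# Qualifiers awarded a first, per year and mode of study
firsts = (
    df[df["Classification"] == "First class honours"]
    .groupby(["Academic year", "Mode of study"])["Number"]
    .sum()
)

# Share of firsts: a crude grade inflation indicator, with years across the columns
share_of_firsts = (firsts / totals).rename("Share of firsts")
print(share_of_firsts.unstack("Academic year").round(3))
```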