The sector-level dashboards that cover student characteristics have a provider-level parallel – the access and participation dashboards do not have a regulatory role, but are provided as evidence to help institutions develop access and participation plans.
Much A&P activity is pre-determined – the current system pretty much insists that universities work with schools locally and address the risks highlighted in the national Equality of Opportunity Risk Register (EORR). It’s a cheeky John Blake way of embedding a national agenda into what are meant to be provider-level plans (which, technically, unlock the ability to charge fees up to the higher level) – but it could also be argued that provider-specific work (particularly on participation measures rather than access) has been underexamined.
The A&P dashboards are a way to focus attention on what may end up being institutionally bound problems – the kinds of things that providers can fix, and quickly, rather than the socio-economic learning revolution end of things that requires a radicalised cadre of hardened activists to lead and inspire the proletariat, or something.
We certainly don’t get any detailed mapping between the numeric targets declared in individual plans and the data – although my colleague Jim did have a go at that a while ago. Instead this is just the raw information for you to examine, hopefully in an easier-to-use and speedier fashion than the official version (which requires a user guide, no less).
Fun with indicators
There are four dashboards here, covering most of what OfS presents in the mega-board. Most of what I’ve done examines four-year aggregations rather than individual years (though there is a timeseries at provider level); I’ve opted for the 95 per cent confidence interval to show the significance of indicator values, and there are a few other minor options that I’ve either not bothered with or set to a sensible default.
I know that nobody reads this for data dashboard design tips, but for me a series of simpler dashboards is far more useful to the average reader than a single behemoth that can do anything – and the way HESA presents (in the main) very simple tables or plain charts to illustrate variations across the sector represents to me a gold standard for provider-level data. OfS is a provider of official statistics, and as such is well aware that section V3.1 of the code of practice requires that:
Statistics, data and explanatory material should be relevant and presented in a clear, unambiguous way that supports and promotes use by all types of users
And I don’t think we are quite there yet with what we have, whereas the simple release of a series of flat tables might get us closer.
If you like it you should have put a confidence interval on it
To start with, here is a tool for constructing ranked displays of providers against a single metric – here defined as a lifecycle stage (access, continuation, completion, attainment, progression) expressed as a percentage of successful achievements for a given subgroup.
Choose your split indicator type, and the actual indicator, on the top right – select the lifecycle stage in the box in the middle, and set mode and level (note that certain splits and stages may only be available for certain modes and levels). You can highlight a provider of interest using the box on the bottom right, and also find an overall sector average by searching for “*”. The colours show provider group, and the arrows are upper and lower confidence bounds at the standard 95 per cent level.
You’ll note that some of the indicators show intersections – with versions of multiple indicators shown together. This allows you to look at, say, white students from a more deprived background. The denominator in the tooltip is the number of students in that population, not the number of students for whom data is available.
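If you want to sanity-check what those arrows actually represent, here’s a minimal sketch of how a 95 per cent interval for a proportion-style indicator could be derived from a numerator and the population denominator. I’ve used the Wilson score interval as an illustrative choice and made the numbers up – treat it as a sketch rather than a restatement of the OfS methodology, which is set out in its own technical documentation.

```python
from math import sqrt

def wilson_interval(successes: int, population: int, z: float = 1.96):
    """95 per cent Wilson score interval for a simple proportion.

    `successes` is the count achieving the lifecycle outcome;
    `population` is the denominator shown in the tooltip.
    """
    p = successes / population
    denom = 1 + z**2 / population
    centre = (p + z**2 / (2 * population)) / denom
    half_width = (z / denom) * sqrt(
        p * (1 - p) / population + z**2 / (4 * population**2)
    )
    return p, centre - half_width, centre + half_width

# Hypothetical example: 210 of 260 students in a split continue into year two
rate, lower, upper = wilson_interval(210, 260)
print(f"indicator {rate:.1%}, 95% interval [{lower:.1%}, {upper:.1%}]")
```

The practical point is that small populations produce wide intervals – which is why plenty of apparently dramatic provider-level differences turn out not to be statistically meaningful.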
I’ve also done a version allowing you to look at all single indicators at a provider level – which might help you to spot particular outliers that need further analysis. Here, each mark is a split indicator (just the useful ones – I’ve omitted stuff like “POLAR quintiles 1, 2, 4, and 5”, which is really only worth bothering with for gap analysis). You can select provider, mode, and level at the top, and highlight a split group (eg “Age (broad)”) or a split (eg “Mature aged 21 and over”).
Note here that access refers to the proportion of all entrants who come from a given sub-group, so even though I’ve shown it on the same axis for the sake of space it measures a slightly different thing – the other lifecycle stages relate to a success rate (be that in continuation, progression, or whatever) based on how OfS defines “success”.
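A quick worked example of the distinction, with entirely invented numbers:

```python
# Hypothetical figures to illustrate the two kinds of indicator.
entrants_total = 1_000          # all entrants at a provider
entrants_mature = 250           # entrants in the "Mature aged 21 and over" split
mature_continuing = 220         # of those, how many continue into a second year

# Access: what share of all entrants come from this sub-group?
access = entrants_mature / entrants_total            # 25.0%

# Continuation (and the other lifecycle stages): what share of the
# sub-group achieve the "success" outcome as OfS defines it?
continuation = mature_continuing / entrants_mature   # 88.0%

print(f"access {access:.1%}, continuation {continuation:.1%}")
```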
Oops upside your head
As you’ve probably spotted from the first section, to get the most out of this data you need to compare splits with other relevant splits. We are talking, then, about gaps – on any of the lifecycle stages – between two groups of students. The classic example is the attainment gap between white and Black students, but you can have all kinds of gaps.
This first one is for a single provider, and for the four lifecycle stages (this time, we don’t get access) you can select your indicator type and two indicators to get the gap between them (mode and level are at the bottom of the screen). When you set your two splits, the largest or most common group tends to be on indicator 1 – that’s just the way the data is designed.
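For the statistically curious, here’s a rough sketch of what a gap and an interval around it might look like under the hood – I’ve used a simple normal approximation for the difference of two independent proportions, which is not necessarily how OfS calculates its own measures, and the figures are made up.

```python
from math import sqrt

def gap_with_interval(s1: int, n1: int, s2: int, n2: int, z: float = 1.96):
    """Percentage-point gap between two splits, with an approximate 95% interval.

    Uses a normal approximation for the difference of two independent
    proportions; split 1 is the larger/reference group, as in the dashboard.
    """
    p1, p2 = s1 / n1, s2 / n2
    gap = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return gap, gap - z * se, gap + z * se

# Hypothetical attainment figures: 820 of 1,000 white students and
# 150 of 220 Black students awarded a first or 2:1.
gap, lower, upper = gap_with_interval(820, 1_000, 150, 220)
print(f"gap {gap:.1%} (95% interval {lower:.1%} to {upper:.1%})")
```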
For some quick context you can search for “*” again on the provider name filter to get sector averages, but I’ve also built a sector ranking to help you put your performance in context alongside similar providers.
This is like a cross between the single ranking and the provider-level gaps analysis – you just need to set the two splits in the same way.
Sign o’ the times
The four-year aggregates are handy for most applications, but as you begin to drill in you are going to start wondering about individual years – are things getting gradually worse, or gradually better? Here I’ve plotted all the individual-year data we get – which is, of course, different for each lifecycle stage (because of when data becomes available). This is at a provider level (filter on the top right), and I’ve included confidence intervals at 95 per cent in a lighter colour.