
We need to talk about access and participation data

An OfS "pink box" report makes David Kernohan wonder why the real questions about data in access and participation aren't being asked.

David Kernohan is Deputy Editor of Wonkhe

The Office for Students’ “pink box” reports are very much for the connoisseur.

While the main release schedule is largely predictable, reports commissioned by the OfS from external consultants (with the famed box noting that they do not necessarily reflect OfS views or positions) permit a greater range of deviation from the OfS house style. It’s to the regulator’s credit that it both commissions and publishes these reports – not least because it’s refreshing to read good quality critiques of policy and regulation. Other than on Wonkhe.

But “Data use for access and participation in higher education” from CFE Research is an odd one even by “pink box” standards, and it takes a little while to spot the hole in it.

Dashboard dilemma

If you think about OfS access and participation data, chances are you’ll be thinking about the dashboard. It wasn’t popular, it wasn’t easy to work with, and it somehow features only in passing in the report – in lines like:

The OfS gaps data is positively perceived as it highlights how the provider is doing in relation to the overall patterns and gaps.”

and

The introduction of the A&P dataset has afforded higher-education providers a rich source of data, as well as focused attention on the whole student lifecycle. Providers are concerned about the timing of this data, the regularity of updates, and the availability of support to interpret the gaps data and/or recognise where progress is being made. Some would like simpler data based on a narrower set of measures, because complex data systems present challenges in providers with more limited data capacity.”

Though the second example is the one that rings true for anyone who has ever worked with the dataset, it’s the placement of the first example in the lengthy overview of the current data landscape that is more telling.

Let’s go back to regulatory notice 1 – just before paragraph 86 we learn that:

The OfS expects providers to use data, and reference the sources of data, in their plan, including the access and participation dataset provided to them by the OfS.”

Providers are welcome to use other sources, but the gaps identified in the data are expected to form the basis of the Access and Participation Plan (mandatory for providers charging higher fees). And OfS will be monitoring progress against this dataset.

So the A&P data dashboard has become the de facto source of truth for such activity in the English sector, and as such you would expect the use of (and issues with) the data to form a central part of this report. That it does not – providers are using UCAS data, HESA data, internal monitoring, and additional paid-for data in both targeting and monitoring activities – suggests that the A&P planning process (and, by extension, the dataset) has become a significant additional burden for providers – far from a positive perception.

I know from conversations all over the sector that providers have struggled to make this data work for them – and while knowing what data they are using is useful, knowing why this new data isn’t working for them would perhaps be a little more helpful.

What do providers actually use?

It is perhaps inevitable that the recommendations on targeting put the onus on the OfS to explain themselves a bit better. There’s a call for guidance and examples on using the provided indicators and measures, and on how to safely draw inferences from them. And in the short term, OfS are asked to develop a series of case studies of effective practice in using the data, and to work with stakeholders to develop consistent definitions of A&P activities and standardised monitoring reports.

You’ll recall that a fuss was kicked up regarding the lack of action on mature students – CFE Research hits the nail squarely on the head and asks OfS to get on and develop a suitable mature learner planning dataset, collaborating with providers that work extensively with such students to do so. It’s also gently hinted that the old HEFCE AHE (Adults in Higher Education) measure might provide a useful basis for this.

So if the A&P dataset is – let’s be kind – not essential, what data are providers using to target the issues some groups of applicants face? Overwhelmingly it is publicly available area-based measures (POLAR, IMD) – 70 out of 76 respondents made use of these, and 66 listed them in their top three. UCAS data, internal data (from enrolment or surveys) and published school and college level (DfE) data are the only ones on a comparable scale.

POLAR and the like are not, of course, “indicative of specific individual characteristics”. The report notes that these issues are well documented. But what they are is free, clean, and (reasonably) easy to work with. And it is those attributes that prove the best indicator of whether any source of data is used.

So there’s a love for UCAS data. Again it isn’t exhaustive (UCAS isn’t the only way into university) but it is timely within a provider and fairly easy to work with. Likewise, people reach for the data in TEF workbooks. It’s just there, and it’s good enough. And there’s less keenness on bundled data like MEM and HEAT groups – the challenge of marrying up individual-level (like free school meals), area-level (like IMD), and provider-level stats requires some serious analytic firepower – even understanding and interpreting what a compound indicator means takes time and effort. Only a few respondents went to the trouble of digging into individual applicant data – citing difficulty of access and timeliness of data – and some added school-derived data to this mix via one of three paid services.
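To give a flavour of what “marrying up” involves in practice, here is a minimal sketch in pandas – with entirely hypothetical file and column names, since every provider’s systems differ – of joining an individual-level flag to an area-level proxy and deriving a crude compound indicator. Even this toy version glosses over missing postcodes, lookup vintages, and what the compound flag actually means.

```python
import pandas as pd

# Entirely hypothetical inputs - real file layouts will differ by provider:
#   students.csv   - one row per student (individual level)
#   imd_lookup.csv - postcode-to-IMD-decile lookup (area level)
students = pd.read_csv("students.csv")    # student_id, postcode, fsm_eligible
imd = pd.read_csv("imd_lookup.csv")       # postcode, imd_decile

# Area-level join: everyone sharing a postcode gets the same decile - which is
# exactly why IMD is a proxy, not an individual characteristic
merged = students.merge(imd, on="postcode", how="left")

# A compound flag in the spirit of basket measures like MEM or HEAT groupings:
# disadvantaged if FSM-eligible (individual level) or in the two most deprived
# IMD deciles (area level)
merged["disadvantaged"] = (
    merged["fsm_eligible"].fillna(False) | (merged["imd_decile"] <= 2)
)

print(merged["disadvantaged"].mean())  # share of the intake flagged
```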

Success and participation measures tip the other way. Here, as you might expect, the institution’s own data is everything – 21 of 60 providers cited internal student management information (despite many reports of infrastructure issues) as most important for measuring student success, and 16 of 60 saw institutional analysis or research as most important for progression. 54 of 60 providers used the OfS A&P data – but then, they were told to… far fewer saw it as the most important source. (42 of 60 used TEF data, to put that in perspective.)

Ease and timing are everything

The story that keeps coming through is one that will ring true for many of us. We reach for the easy-to-access, easy-to-use, easy-to-understand data most of the time. Novel approaches and new products and services are useful for more in-depth analysis – a particular institutional project, for example – but these seem to be the exception.

Where the access and participation dataset failed was in not meeting those three key requirements. It was difficult to work with, difficult to understand, and (for anything beyond the spreadsheet sent to every provider in England) difficult to analyse in context. You could easily compare with the sector, but if you wanted to build your own benchmarks you were out of luck. In the words of the report:

Tensions can arise when A&P practitioners are required to work with complex datasets that might be outside their existing knowledge and expertise. The local context of culture of higher-education providers is also an important consideration, as there is evidence of resistance to data-led processes from some academic staff teams. Interpreting data analysis is also linked to issues around expertise, which frequently arise due to linked data files arriving as ‘flat’ files that require data manipulation and analytical skills in order to make sense of local outcome data.”
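For the uninitiated, this is roughly the kind of manipulation a “flat” file demands before a gap is even visible. A minimal pandas sketch, with made-up column names rather than the actual OfS schema:

```python
import pandas as pd

# Hypothetical long-format "flat" file of the kind the report describes -
# illustrative column names, not the real OfS schema:
#   year, lifecycle_stage, split, indicator
# where split is e.g. "IMD_Q1" / "IMD_Q5" and indicator is a rate
df = pd.read_csv("apd_flat_file.csv")

# Reshape so each split group becomes a column...
wide = df.pivot_table(index=["year", "lifecycle_stage"],
                      columns="split", values="indicator")

# ...then a gap is just a subtraction - e.g. most vs least deprived quintile
wide["gap"] = wide["IMD_Q5"] - wide["IMD_Q1"]

# Continuation-rate gap over time
print(wide.xs("continuation", level="lifecycle_stage")["gap"])
```

None of this is rocket science, but it does assume someone on hand who is comfortable in pandas, R, or some fairly serious Excel.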

There is a skills gap, and one that needs to be addressed quickly if this approach to access and participation planning continues to be mandated.

A lack of expertise for data analysis and interpretation is perceived to be a more prominent barrier for evaluation activities, with just under half (46 per cent) of survey respondents perceiving this as a barrier for access activities and around two-thirds seeing it as a barrier to the evaluation of success and progression activities. Once again, smaller providers (68 per cent) perceived a lack of expertise in data analysis and interpretation to a greater extent compared with larger providers (12 per cent)”

But you have to dig to find these complaints in the report (page 79) – and it is not clear to the uninitiated that this is very much a critique of OfS’s own data practices. The regulator released an evaluation of the financial support evaluation toolkit yesterday too – I’d love to see an evaluation of the access and participation dashboard.

One response to “We need to talk about access and participation data”

  1. Eloquently argued David, if a little one-sided. In my view, the APP Dataset is a giant leap forward for the sector in terms of transparency – there is no hiding from these equality gaps at institutional level any more. And it only takes a few minutes’ read of the APP dataset accompanying guidance and worked examples to get to grips with it. I agree that the nuance is missing – but you can only really expect that with data at the student level – which clearly the OfS cannot share. Institutions can use their own data to add that nuance. But as a fairly simple to follow, no-advanced-Excel-skills-required look at the equality state of play in access, student success and progression, I’d suggest the APP Dataset is a very welcome addition (and there are alternative geographical proxies to the beleaguered POLAR too…)
